
Simulation for Autonomous Systems

Permanent Link: http://ufdc.ufl.edu/UFE0024127/00001

Material Information

Title: Simulation for Autonomous Systems
Physical Description: 1 online resource (56 p.)
Language: english
Creator: Shastri, Milind
Publisher: University of Florida
Place of Publication: Gainesville, Fla.
Publication Date: 2008

Subjects

Subjects / Keywords: Mechanical and Aerospace Engineering -- Dissertations, Academic -- UF
Genre: Mechanical Engineering thesis, M.S.
bibliography   ( marcgt )
theses   ( marcgt )
government publication (state, provincial, territorial, dependent)   ( marcgt )
born-digital   ( sobekcm )
Electronic Thesis or Dissertation

Notes

Abstract: An autonomous vehicle in a typical urban environment must maneuver city streets, merge into moving traffic, navigate traffic circles, negotiate busy intersections, and avoid obstacles. To test each of these behaviors, it is beneficial if the autonomous vehicle sensing and control algorithms can be tested in a simulated environment similar to the actual urban environment. In such a scenario, computer simulations help reduce testing costs, increase the ease and frequency of tests, put test subjects through extreme conditions without fear of harming other objects, and enable quick changes to the system. We developed simulation models of a physical urban vehicle with LADAR and camera sensors and of a typical urban environment, built on the Microsoft Robotics Developer Studio framework. The models were developed as a simulation of the actual autonomous vehicle of team Gator Nation and a part of the environment for the DARPA Urban Challenge 2007.
General Note: In the series University of Florida Digital Collections.
General Note: Includes vita.
Bibliography: Includes bibliographical references.
Source of Description: Description based on online resource; title from PDF title page.
Source of Description: This bibliographic record is available under the Creative Commons CC0 public domain dedication. The University of Florida Libraries, as creator of this bibliographic record, has waived all rights to it worldwide under copyright law, including all related and neighboring rights, to the extent allowed by law.
Statement of Responsibility: by Milind Shastri.
Thesis: Thesis (M.S.)--University of Florida, 2008.
Local: Adviser: Crane, Carl D.

Record Information

Source Institution: UFRGP
Rights Management: Applicable rights reserved.
Classification: lcc - LD1780 2008
System ID: UFE0024127:00001


This item has the following downloads:


Full Text

Introduction

Problems with Testing in a Real Environment

Why the Simulation Solves the Problems

Subjugator

3DRAD Simulink Environment

Maze Simulator

National Institute of Standards and Technology (NIST) Urban Search and Rescue Arenas

Princeton University Entry for Defense Advanced Research Projects Agency (DARPA) Urban Challenge

Introduction to Microsoft Robotics Developer Studio (MRDS)

The MRDS Features Used in the Project

Partnership With a Physics Engine

Ability to Interface With Input Devices: XInput

Multi-Threading Environment

Usage of Code and Concepts from Sample Programs

Simulation C# Environment: Easy Interfacing with Joint Architecture for Unmanned Systems (JAUS)

Sensors

void CreateDefaultState()
{
    _state.Units = sicklrf.Units.Millimeters;
    _state.AngularRange = 90;
    _state.AngularResolution = 0.5f;
}

The simulated laser range finder is attached to the simulated car by inserting it as a child of the car entity as shown below.

LaserRangeFinderEntity vehicleLRF1 = CreateLRF(new Vector3(0.6f, 1.0f, 1.0f), "LRF1");
LaserRangeFinderEntity vehicleLRF2 = CreateLRF(new Vector3(0.6f, 1.0f, 1.0f), "LRF2");
LaserRangeFinderEntity vehicleLRF3 = CreateLRF(new Vector3(0.0f, 0.5f, 1.2f), "LRF3");
LaserRangeFinderEntity vehicleLRF4 = CreateLRF(new Vector3(0.1f, 1.5f, 0.0f), "LRF4");
LaserRangeFinderEntity vehicleLRF5 = CreateLRF(new Vector3(0.1f, 1.5f, 0.0f), "LRF5");
// vehicleLRF1.RaycastProperties.Range = 45;
// vehicleLRF1.RaycastProperties.EndAngle = 180;
// vehicleLRF1.RaycastProperties.StartAngle = 90;
CarBaseEntity.InsertEntity(vehicleLRF1);
CarBaseEntity.InsertEntity(vehicleLRF2);
CarBaseEntity.InsertEntity(vehicleLRF3);
CarBaseEntity.InsertEntity(vehicleLRF4);
CarBaseEntity.InsertEntity(vehicleLRF5);

The obstacle distance data from each of the sensors can be accessed and sent over a port by creating a notifications port and then creating a listener that is activated whenever the sensor is ready to publish a new set of readings. Shown below is the code for accessing and broadcasting readings from one of the sensors.

Port<RaycastResult> notificationTarget = new Port<RaycastResult>();
vehicleLRF1.Register(notificationTarget);
Activate(Arbiter.Receive(
    true, // true means the receiver is persistent
    notificationTarget,
    raycastResults =>
    {
        LogInfo("impact point type " + raycastResults.GetType());
        RaycastImpactPoint min = new RaycastImpactPoint();
        min.Position.W = 20000;
        foreach (RaycastImpactPoint pt in raycastResults.ImpactPoints)
        {
            if (pt.Position.W < min.Position.W)
                min.Position.W = pt.Position.W;
        }
        LogInfo("closest point on LRF1 is " + min.Position.W + " mm away");
        // do something with raycastResults here
    }));

The JAUS Interface

partial class MyCarService
{
    JausComponent _component = new JausComponent(EServiceType.GLOBAL_POSE_SENSOR);
    static int _jausPort = 4700;
    static int _intPort = 4800;

    void StartComponent()
    {
        // services
        List<MessageInfo> inputMsgs = new List<MessageInfo>();
        List<MessageInfo> outputMsgs = new List<MessageInfo>();
        outputMsgs.Add(new MessageInfo((int)EJausMessageType.REPORT_GLOBAL_POSE, uint.MaxValue, 30));
        _component.ServiceConnectionManager.RegisterSupportedServices(
            new ServiceInfo(EServiceType.GLOBAL_POSE_SENSOR, inputMsgs, outputMsgs));

        // operating rate
        _component.ServiceConnectionManager.OperatingRate = 30;
        _component.ServiceConnectionManager.OutboundMessage = new JausMessage(
            new JausHeader()
            {
                Destination = new JausAddress(),
                Source = new JausAddress(),
                CommandCode = (int)EJausMessageType.REPORT_GLOBAL_POSE
            },
            new ReportGlobalPose());

        // state change event
        _component.OnStateChange += new JAUS.NMI.NMIStateChangedCallback(_component_OnStateChange);

        _component.OnConnectionStatusUpdate += new JAUS.NMI.NMIConnectionStatusUpdateCallback(_component_OnConnectionStatusUpdate);

        // attach message processor
        _component.JMH.AddMessageProcessor(this);

        // initialization
        _component.InitNMI(_jausPort, _intPort);

        // state transition
        _component.SetState(EComponentState.STARTUP);
    }

    void _component_OnConnectionStatusUpdate(bool connected)
    {
        if (connected)
            _component.SetState(EComponentState.READY);
    }

    void _component_OnStateChange(EComponentState state)
    {
        switch (state)
        {
            case EComponentState.STARTUP:
                _component.CheckIn((int)EServiceType.GLOBAL_POSE_SENSOR, _jausPort);
                Console.WriteLine("Checking in.");
                break;
            case EComponentState.READY:
                _updateTimer.Start();
                Console.WriteLine("Component is Ready.");
                break;
            case EComponentState.SHUTDOWN:
                _component.CheckOut();
                Console.WriteLine("Checking out.");
                break;
            default:
                break;
        }
    }

    void UpdateGPose(Pose pose)
    {
        ReportGlobalPose gPose = new ReportGlobalPose();
        gPose.PresenceVector.Vector = uint.MaxValue;
        gPose.Latitude = pose.Position.X;
        gPose.Longitude = pose.Position.Z;
        gPose.Elevation = pose.Position.Y;
        JausMessage msg = _component.ServiceConnectionManager.OutboundMessage;
        lock (msg)
        {
            msg.Contents = gPose;
        }

    }

    [MessageProcessor((int)EJausMessageType.CREATE_SERVICE_CONNECTION)]
    void TestInput(JausMessage msg)
    {
        CreateServiceConnection svc = msg.Contents as CreateServiceConnection;
        Console.WriteLine("Service connection request: {0}", svc.CommandCode);
    }
}

Terrain Modeling

#region Environment Entities
void AddSky()
{
    // Add a sky using a static texture. We will use the sky texture
    // to do per pixel lighting on each simulation visual entity
    SkyEntity sky = new SkyEntity("sky.dds", "sky_diff.dds");
    SimulationEngine.GlobalInstancePort.Insert(sky);

    // Add a directional light to simulate the sun.
    LightSourceEntity sun = new LightSourceEntity();
    sun.State.Name = "Sun";
    sun.Type = LightSourceEntityType.Directional;
    sun.Color = new Vector4(0.8f, 0.8f, 0.8f, 1);
    sun.Direction = new Vector3(1.0f, 1.0f, 0.5f);
    SimulationEngine.GlobalInstancePort.Insert(sun);
}

private void AddTerrain()
{
    TerrainEntity ground = new TerrainEntity(
        "victorville5_30percent.bmp",
        //"terrain.bmp",
        //"jumpTerrain.bmp",
        //"wood_cherry.bmp",
        //"woodfloor.jpg",
        //"terrain_tex.jpg",
        "victorville31.jpg",
        new MaterialProperties(
            "ground",
            0.2f, // restitution
            0.9f, // dynamic friction
            0.9f) // static friction
        );
    //ground.MeshScale = new Vector3(1.0f, 0.005f, 1.0f);
    SimulationEngine.GlobalInstancePort.Insert(ground);
}
#endregion

Car Modeling

Ackerman Steering
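The AddTerrain call loads a grayscale bitmap as a heightfield and drapes a separate image over it as the surface texture; MRDS performs the pixel-to-elevation conversion internally. As a rough illustration of how a heightmap becomes elevation data (Python, not thesis code; the linear 0-255 mapping and the max_height parameter are assumptions):

```python
def heights_from_grayscale(pixels, max_height):
    """Map 8-bit grayscale values (0-255) linearly onto elevations
    in [0, max_height]; each pixel becomes one terrain vertex."""
    return [[v / 255.0 * max_height for v in row] for row in pixels]

# A 2x2 heightmap: black pixels stay at ground level, white at peak height.
grid = heights_from_grayscale([[0, 255], [51, 102]], max_height=10.0)
```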

#region Ackerman Steering convert
double d = CarBaseEntity.WheelBase;
double b = CarBaseEntity.DistanceBetweenWheels;
double MAXANGLE = 70;
double A2 = 0;
if (m.Body._steeringAngle > 0)
{
    A2 = d / (d / Math.Tan(m.Body._steeringAngle * MAXANGLE * (float)Math.PI / 180.0f) + b);
    A2 = Math.Atan(A2);
    A2 /= MAXANGLE * (float)Math.PI / 180.0f;
    leftAngle = m.Body._steeringAngle;
    rightAngle = A2;
}
else if (m.Body._steeringAngle < 0)
{
    A2 = d / (d / Math.Tan(m.Body._steeringAngle * MAXANGLE * (float)Math.PI / 180.0f) + b);
    A2 = Math.Atan(A2);
    A2 /= MAXANGLE * (float)Math.PI / 180.0f;
    leftAngle = A2;
    rightAngle = m.Body._steeringAngle;
}
#endregion

Wheel Entity
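The Ackerman conversion above makes the two front wheels turn about a common centre: given the inner wheel's angle, the outer wheel's angle satisfies cot(outer) = cot(inner) + track/wheelbase, which is exactly the expression inside the Atan call. A geometry check (illustrative Python, not thesis code; the function and parameter names are hypothetical, using the thesis's 4.0 wheelbase and 3.5 track):

```python
import math

def ackermann_outer_angle(inner_deg, wheelbase, track):
    """Outer-wheel steering angle sharing a turning centre with the
    inner wheel: cot(outer) = cot(inner) + track / wheelbase."""
    inner = math.radians(inner_deg)
    outer = math.atan(wheelbase / (wheelbase / math.tan(inner) + track))
    return math.degrees(outer)

# With this geometry the outer wheel always turns less sharply than the inner.
outer = ackermann_outer_angle(30.0, wheelbase=4.0, track=3.5)  # about 21 degrees
```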

Friction

float LoAsSlip = 0.04f;
float LoExSlip = 0.01f;
float LoExValue = 0.01f;
float LoAsValue = 0.06f;
float LoStiffness = 50000f;
float LaStiffness = 10000f;

// front left wheel
WheelShapeProperties w = new WheelShapeProperties("front left wheel", FrontWheelMass, FrontWheelRadius);
w.TireLateralForceFunction = new TireForceFunctionDescription();
w.TireLateralForceFunction.AsymptoteSlip = LoAsSlip;
w.TireLateralForceFunction.AsymptoteValue = LoAsValue;
w.TireLateralForceFunction.ExtremumSlip = LoExSlip;
w.TireLateralForceFunction.ExtremumValue = LoExValue;
w.TireLateralForceFunction.StiffnessFactor = LaStiffness;

#region CAR_PA Robot entity
void AddVehicle(Vector3 CarPosition)
{
    CarBaseEntity = new SimpleFourByFour(CarPosition);
    CarBaseEntity.State.Name = "Navigator";
    CarBaseEntity.FrontWheelMesh = "4x4wheel.obj";
    CarBaseEntity.RearWheelMesh = "4x4wheel.obj";
    LogInfo("Car created!!!");
    CarBaseEntity.State.Velocity = new Vector3(1, 1, 5);
    CarBaseEntity.State.AngularVelocity = new Vector3(0.0f, 0.0f, 0.0f);
    CarBaseEntity.SuspensionTravel = 0.127f;
    CarBaseEntity.SuspensionTravel = 0.5f; // off road
    CarBaseEntity.State.Assets.Mesh = "4x4Body.obj";
    CarBaseEntity.MotorTorqueScaling = 60.0f;
    CarBaseEntity.DistanceBetweenWheels = 3.5f;
    CarBaseEntity.WheelBase = 4.0f;

    // The laser range finders are inserted as children of the car vehicle
    LaserRangeFinderEntity vehicleLRF1 = CreateLRF(new Vector3(0.6f, 1.0f, 1.0f), "LRF1");
    LaserRangeFinderEntity vehicleLRF2 = CreateLRF(new Vector3(0.6f, 1.0f, 1.0f), "LRF2");

    LaserRangeFinderEntity vehicleLRF3 = CreateLRF(new Vector3(0.0f, 0.5f, 1.2f), "LRF3");
    LaserRangeFinderEntity vehicleLRF4 = CreateLRF(new Vector3(0.1f, 1.5f, 0.0f), "LRF4");
    LaserRangeFinderEntity vehicleLRF5 = CreateLRF(new Vector3(0.1f, 1.5f, 0.0f), "LRF5");
    // vehicleLRF1.RaycastProperties.Range = 45;
    // vehicleLRF1.RaycastProperties.EndAngle = 180;
    // vehicleLRF1.RaycastProperties.StartAngle = 90;

    // Some points are added around the car to provide a non-uniform weight
    // distribution similar to the actual car and also provide physical support
    SingleShapeEntity weight1 = AddWeight("bigWeight", new Vector3(0, 0f, 0), 2);
    SingleShapeEntity wheelieBar = AddWeight("Wheelie Bar", new Vector3(0f, 0.3f, 2.0f), 1);
    SingleShapeEntity noseWheelieBar = AddWeight("Nose Wheelie Bar", new Vector3(0f, 0.3f, 2.0f), 1);
    SingleShapeEntity leftBar = AddWeight("Left topple bar", new Vector3(2.0f, 0.3f, 0.0f), 1);
    SingleShapeEntity rightBar = AddWeight("Right topple bar", new Vector3(2.0f, 0.3f, 0.0f), 1);

    // Each component of the vehicle is added as a child of the car entity
    CarBaseEntity.InsertEntity(weight1);
    CarBaseEntity.InsertEntity(wheelieBar);
    CarBaseEntity.InsertEntity(noseWheelieBar);
    CarBaseEntity.InsertEntity(leftBar);
    CarBaseEntity.InsertEntity(rightBar);
    CarBaseEntity.InsertEntity(vehicleLRF1);
    CarBaseEntity.InsertEntity(vehicleLRF2);
    CarBaseEntity.InsertEntity(vehicleLRF3);
    CarBaseEntity.InsertEntity(vehicleLRF4);
    CarBaseEntity.InsertEntity(vehicleLRF5);

    Port<RaycastResult> notificationTarget = new Port<RaycastResult>();
    vehicleLRF1.Register(notificationTarget);
    Activate(Arbiter.Receive(
        true, // true means the receiver is persistent
        notificationTarget,
        raycastResults =>
        {
            LogInfo("impact point type " + raycastResults.GetType());
            RaycastImpactPoint min = new RaycastImpactPoint();
            min.Position.W = 20000;
            foreach (RaycastImpactPoint pt in raycastResults.ImpactPoints)
            {
                if (pt.Position.W < min.Position.W)
                    min.Position.W = pt.Position.W;
            }

            LogInfo("closest point on LRF1 is " + min.Position.W + " mm away");
            // do something with raycastResults here
        }));

    // create camera entity and start SimulatedWebcam service
    CameraEntity camera = CreateCamera();
    // insert as child of motor base
    CarBaseEntity.InsertEntity(camera);
    CameraSprite camSprite = new CameraSprite(8.0f, 6.0f, SpritePivotType.Center, 0, new Vector3(0, 5, 0));
    camSprite.State.Name = "NaviGator camSprite";
    camSprite.Flags |= VisualEntityProperties.DisableBackfaceCulling;
    camera.InsertEntity(camSprite);

    SimulationEngine.GlobalInstancePort.Insert(CarBaseEntity);

    // Start simulated arcos motor service
    //CarBaseEntity.SetWheelAngles(0.45f, 0.45f, 0, 0);
    Activate(Arbiter.Receive<Drive>(
        true,
        _mainPort,
        m =>
        {
            CarBaseEntity.SetMotorTorque(m.Body._power, m.Body._power);
            double leftAngle = m.Body._steeringAngle, rightAngle = m.Body._steeringAngle;

            #region Ackerman Steering convert
            double d = CarBaseEntity.WheelBase;
            double b = CarBaseEntity.DistanceBetweenWheels;
            double MAXANGLE = 70;
            double A2 = 0;
            if (m.Body._steeringAngle > 0)
            {
                A2 = d / (d / Math.Tan(m.Body._steeringAngle * MAXANGLE * (float)Math.PI / 180.0f) + b);
                A2 = Math.Atan(A2);
                A2 /= MAXANGLE * (float)Math.PI / 180.0f;
                leftAngle = m.Body._steeringAngle;
                rightAngle = A2;
            }
            else if (m.Body._steeringAngle < 0)
            {
                A2 = d / (d / Math.Tan(m.Body._steeringAngle * MAXANGLE * (float)Math.PI / 180.0f) + b);
                A2 = Math.Atan(A2);
                A2 /= MAXANGLE * (float)Math.PI / 180.0f;
                leftAngle = A2;
                rightAngle = m.Body._steeringAngle;
            }
            #endregion

            CarBaseEntity.SetWheelAngles((float)leftAngle, (float)rightAngle, 0, 0);
        }));

    /*
    #region Simulated LRF distance accessing
    Port notificationTarget1 = new Port();

    vehicleLRF1.Register(notificationTarget1);
    Activate(Arbiter.Receive(
        true,
        notificationTarget1,
        raycastResults =>
        {
            // do something with raycastResults here
            System.Console.WriteLine("LaserData: {0}", raycastResults.ImpactPoints.ToString());
        }));
    #endregion
    */
}
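The TireLateralForceFunction parameters set at the top of this listing (ExtremumSlip/ExtremumValue and AsymptoteSlip/AsymptoteValue) define the lateral slip curve the physics engine evaluates each step: force grows toward the extremum point and then settles to the asymptote value for larger slips. The actual PhysX curve is smooth; a simplified piecewise-linear sketch of the same shape (Python, illustrative only; the function name is hypothetical and the defaults are the thesis's values):

```python
import math

def tire_lateral_force(slip, ex_slip=0.01, ex_value=0.01,
                       as_slip=0.04, as_value=0.06):
    """Piecewise-linear slip curve: zero force at zero slip, ex_value
    at ex_slip, as_value at and beyond as_slip; sign follows the slip."""
    s = abs(slip)
    if s <= ex_slip:
        f = ex_value * s / ex_slip
    elif s <= as_slip:
        t = (s - ex_slip) / (as_slip - ex_slip)
        f = ex_value + t * (as_value - ex_value)
    else:
        f = as_value
    return math.copysign(f, slip) if slip != 0 else 0.0

# The force saturates at as_value once the slip exceeds as_slip.
```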

Results

Summary of Results

Future Work