MS in Data Science capstone project. Studying attacks on autonomous vehicles.

Overview

Surveying Attack Models for CAVs

Guide to Installing CARLA and Collecting Data

Our project focuses on surveying attack models for Connected Autonomous Vehicles (CAVs). The primary tool we will be using throughout the project is CARLA, a vehicle simulation platform. This document serves as a guide to getting CARLA running on your system and showcases a script we adapted from the original CARLA installation. It closely follows the quick installation from the CARLA leaderboard challenge.

Requirements

CARLA runs best on Ubuntu 18.04 and Windows. The following requirements are for an Ubuntu system:

  • Python 3
  • Anaconda
  • pip installer
  • A GPU with at least 6 GB of memory
  • 20 GB of disk space

Installation

  1. Download the CARLA 0.9.10.1 release found here and unzip the package into a folder named CARLA.

  2. Download the leaderboard repo:

    git clone -b stable --single-branch https://github.com/carla-simulator/leaderboard.git

  3. Switch into the leaderboard directory you just cloned and run the following:

    pip3 install -r requirements.txt

  4. Clone the scenario_runner repo:

    git clone -b leaderboard --single-branch https://github.com/carla-simulator/scenario_runner.git

  5. cd into the scenario runner directory and run the following:

    pip3 install -r requirements.txt

  6. Now the environment variables need to be defined. Open a fresh terminal and open your ~/.bashrc in an editor:

    gedit ~/.bashrc

  7. Add the following lines to your ~/.bashrc:

    export CARLA_ROOT=PATH_TO_CARLA_ROOT
    export SCENARIO_RUNNER_ROOT=PATH_TO_SCENARIO_RUNNER
    export LEADERBOARD_ROOT=PATH_TO_LEADERBOARD
    export PYTHONPATH="${CARLA_ROOT}/PythonAPI/carla/":"${SCENARIO_RUNNER_ROOT}":"${LEADERBOARD_ROOT}":"${CARLA_ROOT}/PythonAPI/carla/dist/carla-0.9.10-py3.7-linux-x86_64.egg":${PYTHONPATH}

  8. Source the file so the changes take effect:

    source ~/.bashrc
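
As a quick sanity check (not part of the official leaderboard instructions), you can verify from a fresh terminal that the CARLA Python API is now importable. This is a minimal sketch and assumes the exports above have been sourced:

    # Minimal sanity check: if PYTHONPATH includes the CARLA egg listed above,
    # this import should succeed.
    import carla

    # Print where the module was loaded from to confirm the right egg is on the path.
    print("CARLA Python API loaded from:", carla.__file__)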

Running CARLA

To run CARLA, open a terminal, change into your CARLA root directory, and run the following:

    ./CarlaUE4.sh

This should open a window showing the CARLA simulator environment.
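
With the simulator running, you can confirm that a Python client can reach it. The snippet below is a minimal sketch; the host, port, and timeout are CARLA defaults rather than project-specific settings:

    import carla

    # Connect to the CARLA server started by ./CarlaUE4.sh (default RPC port 2000).
    client = carla.Client("localhost", 2000)
    client.set_timeout(10.0)  # seconds

    # Grab the current world and report which map is loaded.
    world = client.get_world()
    print("Connected to map:", world.get_map().name)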

Collecting Data

For our project, we are primarily concerned with collecting LiDAR, radar, camera, and GPS data. Thus far, we have a basic test Python script that logs the latitude and longitude of a vehicle placed in the environment.

The test script can be found under tutorials/alana_test.py. It should be placed in the PythonAPI folder under your CARLA root directory.
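
For reference, the sketch below shows roughly how such a GNSS logger can be put together with the CARLA Python API. It is illustrative only; the blueprint choice, logging duration, and output file name are assumptions, and the actual implementation lives in tutorials/alana_test.py:

    import csv
    import time

    import carla

    # Connect to a running CARLA server.
    client = carla.Client("localhost", 2000)
    client.set_timeout(10.0)
    world = client.get_world()

    # Spawn a vehicle on autopilot (the blueprint choice here is arbitrary).
    blueprints = world.get_blueprint_library()
    vehicle_bp = blueprints.filter("vehicle.tesla.model3")[0]
    spawn_point = world.get_map().get_spawn_points()[0]
    vehicle = world.spawn_actor(vehicle_bp, spawn_point)
    vehicle.set_autopilot(True)

    # Attach a GNSS sensor to the vehicle and buffer its measurements.
    gnss_bp = blueprints.find("sensor.other.gnss")
    gnss = world.spawn_actor(gnss_bp, carla.Transform(), attach_to=vehicle)
    rows = []

    def on_gnss(event):
        # Each GnssMeasurement carries a simulation timestamp and a geolocation.
        rows.append([event.timestamp, event.latitude, event.longitude, event.altitude])

    gnss.listen(on_gnss)

    try:
        time.sleep(30)  # log for roughly 30 seconds
    finally:
        gnss.stop()
        gnss.destroy()
        vehicle.destroy()
        with open("outputgnss.csv", "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["time", "latitude", "longitude", "altitude"])
            writer.writerows(rows)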

The output will be a .csv file; a sample can be found under tutorials/outputgnss.csv. It should look like this:

time                latitude               longitude              altitude
8.076389706286136   0.001497075674478765   0.0013884266232332388  2.5777945518493652
12.463837534713093  0.0015216552946100137  0.0013888545621243858  2.003244638442993
16.73477230645949   0.0015536632594717048  0.0013894208066981615  1.2948381900787354
21.952862125413958  0.001598166573884896   0.0013902203478743502  0.5753694176673889
27.885887853393797  0.001605972963332647   0.001390361257925631   0.47810444235801697
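
Once a log has been written, a quick way to inspect it is with pandas. This assumes the file is comma-delimited with the columns shown above:

    import pandas as pd

    # Assumes outputgnss.csv is comma-delimited with columns time, latitude, longitude, altitude.
    df = pd.read_csv("tutorials/outputgnss.csv")
    print(df.describe())

    # Latitude/longitude over time gives a rough view of the vehicle's trajectory.
    print(df[["time", "latitude", "longitude"]].head())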