What Are Good Deep Learning Papers for Self Driving Cars?

CONTENTS:

KEY TOPICS

  • A new approach called BlockDrop, described in the paper "BlockDrop: Dynamic Inference Paths in Residual Networks," enables deep learning algorithms to drop layers of a neural network conditioned on the input, allowing the system to allocate resources more efficiently while still identifying images accurately.(More…)
  • The 2018 Audi A8 Luxury Sedan was the first commercial car to claim Level 3 self-driving capability.(More…)
  • Boston Director, Product Strategy. Who we are: Neurala, Inc. (www.neurala.com) is a software company located in the Seaport District of Boston, MA, that developed The Neurala Brain, deep learning neural network software that makes smart products like inspection cameras, robots, drones, toys, consumer electronics and self-driving cars more autonomous, engaging and useful.(More…)
  • I predict it will be like that with AI. Each fatality caused by a self-driving car will cut the number of VCs likely to invest in AI by half.(More…)

POSSIBLY USEFUL

  • PyTorch easy-to-follow, step-by-step Deep Q-Learning tutorial with clean, readable code.(More…)

RANKED SELECTED SOURCES

What Are Good Deep Learning Papers for Self Driving Cars?
Image courtesy: Deep Learning for Self-Driving Cars – MIT (http://selfdrivingcars.mit.edu/resources/)

KEY TOPICS

A new approach called BlockDrop, described in the paper "BlockDrop: Dynamic Inference Paths in Residual Networks," enables deep learning algorithms to drop layers of a neural network conditioned on the input, allowing the system to allocate resources more efficiently while still identifying images accurately. [1] Before deep learning's influence on computer vision, other machine learning approaches such as random forests were used to perform segmentation. [2] A series of Jupyter notebooks walks you through the fundamentals of machine learning and deep learning in Python using scikit-learn and TensorFlow. [3] Master the essential skills needed to recognize and solve complex real-world problems with machine learning and deep learning by leveraging the highly popular Python machine learning ecosystem. [3]
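
As a rough illustration of the idea (and not the paper's actual implementation, which trains its gating policy with reinforcement learning and truly skips the dropped computation), the sketch below shows a PyTorch residual block whose body can be zeroed out per input by a tiny policy head; the class name and policy design are invented for illustration.

```python
# Illustrative sketch only: a residual block whose body can be skipped per
# input, loosely in the spirit of BlockDrop-style dynamic inference. The class
# and the tiny policy head are invented for illustration, not the paper's code.
import torch
import torch.nn as nn


class GatedResidualBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
        )
        # Tiny policy head: looks at a pooled summary of the input and outputs
        # the probability that this block should be executed for that input.
        self.policy = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        p_execute = self.policy(x)                        # shape: (batch, 1)
        gate = (p_execute > 0.5).float().view(-1, 1, 1, 1)
        # The identity path is always kept; the block body is zeroed out for
        # inputs whose gate is 0. A real implementation would skip the
        # computation entirely to save time and would train the gate.
        return x + gate * self.body(x)


block = GatedResidualBlock(channels=16)
images = torch.randn(4, 16, 32, 32)
print(block(images).shape)    # torch.Size([4, 16, 32, 32])
```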

This project makes use of deep learning concepts learned during the first term of Udacity's nanodegree. [2] A curated list of awesome deep learning tutorials, projects and communities. [3] Innovations like these rethink how today's deep learning systems can be designed in order to make them a practical reality for current and future applications. [1] Data science Python notebooks: deep learning (TensorFlow, Theano, Caffe, Keras), scikit-learn, Kaggle, big data (Spark, Hadoop MapReduce, HDFS), matplotlib, pandas, NumPy, SciPy, Python essentials, AWS, and various command lines. [3] Keras code and weights files for popular deep learning models. [3]

In a widely read article published early this year on arXiv.org, a site for scientific papers, Gary Marcus, a professor at New York University, posed the question: “Is deep learning approaching a wall?” He wrote, “As is so often the case, the patterns extracted by deep learning are more superficial than they initially appear.” [4] Intel is also developing deep neural learning units for fleets and eventually for complete autonomous driving systems. (5) The company wants to perfect the fusion component in which parallel and sequential signals are merged, read and deciphered on the way to semantic understanding. [5] Its A.I. technology learned from relatively few examples to mimic human visual intelligence, using data 300 times more efficiently than deep learning models. [4] The AI systems we have today rely on deep learning and thus tend to require large amounts of labeled data; that data is used to train large models that require a lot of computational resources. [6] AI is all about machine learning, and machine learning is all about deep learning (DL), according to the hype. [7] Deep learning is a specific machine learning technique, and its success in a variety of domains has led to the renewed interest in AI. [6] Deep learning has been at the forefront of the so-called AI revolution for years now, and many people believed that it would take us to the world of the technological singularity. [8]

By far the biggest blow to deep learning fame is the domain of self-driving vehicles. [8] The premise of deep learning is that that is all the machine needs to learn how to correctly distinguish between them. [9] With Volta, we reinvented the GPU. Its revolutionary Tensor Core architecture enables multi-precision computing — cranking through deep learning matrix operations at 125 teraflops at FP16 precision, and using FP64 and FP32 when there’s a need for greater range, precision or numerical stability. [10] For the past few years, Kumar has been leading an effort to use GPU-powered deep learning to more accurately diagnose cancers sooner using ultrasound images. [11] Here’s hoping that the company takes a look at its use of deep learning with a cold, hard eye — and puts the intelligence back into A.I. [9] They often use deep learning as one ingredient among others in their recipe. [4] Kyndi and others are betting that the time is finally right to take on some of the more daunting challenges in A.I. That echoes the trajectory of deep learning, which made little progress for decades before the recent explosion of digital data and ever-faster computers fueled leaps in performance of its so-called neural networks. [4] Kyndi’s office in San Mateo, Calif. The company’s focus on the reasoning side of artificial intelligence distinguishes it from the branch known as deep learning, in which computers train themselves by processing massive amounts of data. [4] As The New York Times notes in its article ” Is There a Smarter Path to Artificial Intelligence? Some Experts Hope So,” “Some scientists are asking whether deep learning is really so deep after all.” [9] For the past five years, the hottest thing in artificial intelligence has been a branch known as deep learning. [4] If the reach of deep learning is limited, too much money and too many fine minds may now be devoted to it, said Oren Etzioni, chief executive of the Allen Institute for Artificial Intelligence. [4] Deep learning comes from the statistical side of A.I. known as machine learning. [4] As groundbreaking as his work is, Kumar urges patience when it comes to applying AI and deep learning in the medical field. [11] In my opinion, signs already show a huge decline in deep learning (and probably in AI in general as this term has been abused ad nauseam), yet hidden from the majority by an increasingly intense narrative. [8] Much of the current media coverage about AI revolves around deep learning. [6] Recent prominent examples of AI systems–systems that excelled at Go and Poker–used deep learning and other methods. [6] Companies like Google, Facebook and Microsoft have poured money into deep learning. [4] While deep learning software can instantly identify millions of words, it has no understanding of a concept like “justice,” “democracy” or “meddling.” [4] While that program and other efforts vary, their common goal is a broader and more flexible intelligence than deep learning. [4] Let me begin by citing a recent survey we conducted: among other things, we found that a majority (54%) consider deep learning an important part of their future projects. [6] He dove in, spending more than six months teaching himself everything he could about building and working with deep learning models. [11] Aside from deep learning, reinforcement learning (RL) stands out as a topic gaining interest among companies. 
[6] Kumar’s team does its local processing using the TensorFlow deep learning framework container from NVIDIA GPU Cloud (NGC) on NVIDIA TITAN and GeForce GPUs. [11] It used to take two or three days for him to configure a system for deep learning, and now takes as little as a couple of hours. [11]

Autonomous cars are being developed with deep neural networks, a type of deep learning architecture with many computational stages, or levels, in which simulated neurons are activated by inputs from the environment. [12] “An Empirical Evaluation of Deep Learning on Highway Driving”. arXiv:1504.01716. [12] Deep learning has instead given us machines with truly impressive abilities but no intelligence. [13] AlphaGo is an artificial intelligence (AI) program built using deep learning technologies. [14] Although deep learning, a branch of artificial intelligence, has become prominent only recently, it is based on concepts that are familiar to chemical engineers. [14] Deep learning is a subfield of ML that uses algorithms called artificial neural networks (ANNs), which are inspired by the structure and function of the brain and are capable of self-learning. [14] This article describes artificial neural networks, the algorithms that enable deep learning. [14] Leading critics of Deep Learning such as Gary Marcus are amused, but also disturbed, by the notion that Deep Learning and its generalizations such as Differentiable Programming are really setting us on a path toward true Artificial General Intelligence. [15] Leading proponents of contemporary Deep Learning think it still has a chance to broaden and expand the current horizons, eventually leading to the same original goal for all, which is Artificial General Intelligence. [15]
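
To make the idea of an artificial neural network concrete, here is a minimal, generic sketch of a small feedforward network in PyTorch; the layer sizes and data are arbitrary and are not taken from any of the cited articles.

```python
# Minimal feedforward artificial neural network sketch (generic; sizes arbitrary).
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(10, 32),   # each "neuron" computes a weighted sum of its inputs
    nn.ReLU(),           # nonlinear activation decides how strongly it "fires"
    nn.Linear(32, 3),    # output layer: one score per class
)

features = torch.randn(8, 10)          # a batch of 8 example inputs
targets = torch.randint(0, 3, (8,))    # their (random) class labels
loss = nn.CrossEntropyLoss()(model(features), targets)
loss.backward()                        # backpropagation: the "self-learning" step
print(loss.item())
```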

RNNs are used in deep learning and in the development of models that simulate the activity of neurons in the human brain. [16] Outside academia, deep learning is already being used by practicing engineers to solve a whole range of previously intractable problems and may become as valuable as Excel to chemical engineers in the future. [14] This whitepaper made the case for deep learning as a foundational technology that should transform not only technology companies but every sector of the global economy. [17]
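
A minimal sketch of a recurrent layer in PyTorch is shown below (generic, with arbitrary sizes); the point is simply that a hidden state is carried from one time step to the next, which is what lets the network model sequences.

```python
# Minimal recurrent neural network (RNN) sketch in PyTorch (illustrative only).
import torch
import torch.nn as nn

rnn = nn.RNN(input_size=8, hidden_size=16, batch_first=True)
sequences = torch.randn(4, 20, 8)          # 4 sequences, 20 time steps, 8 features each
outputs, final_hidden = rnn(sequences)     # hidden state is carried across time steps
print(outputs.shape, final_hidden.shape)   # (4, 20, 16) and (1, 4, 16)
```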

The 2018 Audi A8 Luxury Sedan was the first commercial car to claim Level 3 self-driving capability. [12] Self-driving cars will be able to accelerate and brake more efficiently, meaning higher fuel economy from reducing the wasted energy typically associated with inefficient changes in speed (energy typically lost to friction, in the form of heat and sound). [12]

Boston Director, Product Strategy. Who we are: Neurala, Inc. (www.neurala.com) is a software company located in the Seaport District of Boston, MA, that developed The Neurala Brain, deep learning neural network software that makes smart products like inspection cameras, robots, drones, toys, consumer electronics and self-driving cars more autonomous, engaging and useful. [18] They close with discussions about Gary Marcus’s recent article, which offers a critical appraisal of Deep Learning, and a recent paper that suggests that convolutional neural nets may not be as good at “grasping” higher-level abstract concepts as is typically believed. [19] Deep Learning (DL) has become more than just a buzzword in the Artificial Intelligence (AI) community: it is reshaping global business through the prolific use of autonomous, self-teaching systems, which can build models by directly studying images, text, audio, or video data. [20] According to many technical professionals, businesses can reap the full benefits of AI only when the appropriate levels of competency are developed in advanced data technologies such as Machine Learning (ML) and Deep Learning for extracting reliable business insights. [20] Two major factors differentiate Deep Learning from other AI technologies: the largeness of training data and the direct analysis of unstructured data. [20] Azure Batch AI is a service that helps users provision and manage clusters of virtual machines for deep learning training jobs. [21] The ultimate push came from the four giants of the IT industry (Facebook, Google, Microsoft, and IBM), who were all out to win the AI-in-the-enterprise race by leveraging their Deep Learning technology development strategies. [20] You will learn how to perform distributed deep learning on Azure, and how you can do this using Horovod running on Azure Batch AI. [21] In this blog I present my thoughts on how PVM relates to deep learning and the global AI landscape. [22] The Computer World article Deep Learning Use Cases for ASEAN describes how DL algorithms can be used to aid traffic management in ASEAN member countries. [20] Deep Learning algorithms are becoming more widely used in every industry sector from online retail to photography; some use cases are more popular and have attracted more attention from global media than others. [20] Deep Learning use cases have been widely applied to knowledge discovery and Predictive Analytics. [20] A key consideration in distributed deep learning is how to efficiently use the resources that are available (CPUs, GPUs, and more). [21] The Deep Learning revolution began from a need to build “high-accuracy predictive models” from unstructured data such as images, voice, and natural language. [20] One of the primary drivers of Deep Learning is that it can crunch much more data at very high speeds. DL techniques have become necessary for successful pattern recognition in large unstructured data. [20] It’s quite clear that large and small companies alike are making heavy investments in Deep Learning technologies, as they all think such advances will be core drivers of enterprise growth far into the future. [20] Deep Learning has pervaded the global business landscape, capturing the undivided attention of industry giants like IBM, Facebook, Google, Microsoft, Twitter, PayPal, and Yahoo, among others. [20] Deep Learning takes this learning process one step further by directly working with images, audio, or video data without the data going through any kind of initial preparation.
[20] Atomwise: another startup that applies Deep Learning technology to drug discovery. [20] According to a recent Tractica report on Deep Learning, the DL software market will expand from “$655 million in 2016 to $34.9 billion worldwide by 2025.” [20] The NVIDIA DeepStream Software Development Kit (SDK) was originally released in 2017 to simplify the deployment of scalable intelligent video analytics (IVA) powered by deep learning. [23] This blog will show how you can train an object detection model by distributing deep learning training to multiple GPUs. [21] In this blog post, we showed you how to do distributed deep learning using Horovod on Azure. [21] Below we briefly discuss several ways distributed training can be accomplished, and introduce Horovod, a distributed deep learning framework that can be used with TensorFlow, Keras and PyTorch. [21] One answer to this processing need has been provided by NVIDIA’s GPU systems and open source Deep Learning libraries. [20] In this lesson, we’ll cover topics in Deep Learning including Convolutional Neural Networks. [24] In this term, you’ll cover topics in deep learning and reinforcement learning. [24] Over the past few years, many exciting deep learning approaches for object detection have emerged. [21]
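
For readers curious what the Horovod pattern looks like in practice, the sketch below outlines the usual structure with PyTorch; it is a simplified, assumed example rather than the blog's exact code, and it presumes Horovod is installed and the script is launched with a launcher such as horovodrun.

```python
# Simplified Horovod + PyTorch pattern (sketch of the usual structure; launch
# with something like `horovodrun -np 4 python train.py`).
import torch
import torch.nn.functional as F
import horovod.torch as hvd

hvd.init()                                            # one worker process per GPU
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
if device.type == "cuda":
    torch.cuda.set_device(hvd.local_rank())

model = torch.nn.Linear(128, 10).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01 * hvd.size())  # common practice: scale LR

# Make every worker start from identical parameters, then wrap the optimizer
# so gradients are averaged across workers via allreduce after each step.
hvd.broadcast_parameters(model.state_dict(), root_rank=0)
hvd.broadcast_optimizer_state(optimizer, root_rank=0)
optimizer = hvd.DistributedOptimizer(optimizer, named_parameters=model.named_parameters())

for step in range(100):
    # A real job would give each worker its own data shard, e.g. with
    # torch.utils.data.distributed.DistributedSampler.
    x = torch.randn(32, 128, device=device)
    y = torch.randint(0, 10, (32,), device=device)
    optimizer.zero_grad()
    loss = F.cross_entropy(model(x), y)
    loss.backward()
    optimizer.step()
```

The design choice behind Horovod is data parallelism: each worker keeps a full copy of the model and gradients are averaged with an allreduce, which is why the only changes to a single-GPU script are the init call, the initial broadcasts, and the optimizer wrapper.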

I predict it will be like that with AI. Each fatality caused by a self-driving car will cut the number of VCs likely to invest in AI by half. [22]

POSSIBLY USEFUL

PyTorch easy-to-follow, step-by-step Deep Q-Learning tutorial with clean, readable code. [3] A set of deep reinforcement learning agents implemented in TensorFlow. [3]
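
To give a flavor of what such a tutorial covers, here is a compact, generic sketch of the core Deep Q-Learning update in PyTorch (not the tutorial's own code): an epsilon-greedy policy, a replay buffer, and a temporal-difference target computed with a frozen target network. The environment and hyperparameters are placeholders.

```python
# Generic Deep Q-Learning sketch in PyTorch (illustrative only). Transitions
# are assumed to be stored in the replay buffer as tensors, with actions int64.
import random
from collections import deque
import torch
import torch.nn as nn
import torch.nn.functional as F

obs_dim, n_actions, gamma = 4, 2, 0.99

q_net = nn.Sequential(nn.Linear(obs_dim, 64), nn.ReLU(), nn.Linear(64, n_actions))
target_net = nn.Sequential(nn.Linear(obs_dim, 64), nn.ReLU(), nn.Linear(64, n_actions))
target_net.load_state_dict(q_net.state_dict())       # target net starts as a copy
optimizer = torch.optim.Adam(q_net.parameters(), lr=1e-3)
replay = deque(maxlen=10_000)                         # (s, a, r, s', done) tensors


def select_action(state, epsilon=0.1):
    """Epsilon-greedy: explore with probability epsilon, otherwise act greedily."""
    if random.random() < epsilon:
        return random.randrange(n_actions)
    with torch.no_grad():
        return q_net(state.unsqueeze(0)).argmax(dim=1).item()


def train_step(batch_size=32):
    """One temporal-difference update on a random minibatch from the buffer."""
    if len(replay) < batch_size:
        return
    batch = random.sample(replay, batch_size)
    states, actions, rewards, next_states, dones = (torch.stack(t) for t in zip(*batch))
    q_taken = q_net(states).gather(1, actions.view(-1, 1)).squeeze(1)
    with torch.no_grad():                             # TD target uses the frozen target net
        best_next = target_net(next_states).max(dim=1).values
        targets = rewards + gamma * best_next * (1.0 - dones)
    loss = F.smooth_l1_loss(q_taken, targets)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```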

Including Natural Language Processing and Computer Vision projects, such as text generation, machine translation, deep convolutional GANs, and other hands-on project code. [3] While such systems have proven their ability to accurately identify images and multimedia with minimal human intervention, in reality the deep learning-based methods used today take too long to train, consume far too much power, and may harvest vast amounts of data unnecessarily. [1] In October 2016, Google announced Google Home, its competitor to Amazon Echo, which features deep integration with other Google products, like YouTube, Google Play Music, Nest, and Google Assistant. [25]

The nature of how machine learning is being undertaken today for AI self-driving cars. [26] To simplify the discussion, think of AI as the broader goal of autonomous machine intelligence, and machine learning as the specific scientific methods currently in vogue for building AI. All machine learning is AI, but not all AI is machine learning. [25] At the Cybernetic Self-Driving Car Institute, we are developing AI systems for self-driving cars that use Machine Learning, and we also aid firms in assessing and improving their ML systems. [27] I call upon my fellow AI self-driving car industry colleagues to find a means to extend MLPerf with Machine Learning models and datasets that are specific to self-driving cars. [27]

In the future, AI will shorten your commute even further via self-driving cars that result in up to 90% fewer accidents, more efficient ride sharing to reduce the number of cars on the road by up to 75%, and smart traffic lights that reduce wait times by 40% and overall travel time by 26% in a pilot study. [25]

How do they determine the price of your ride? How do they minimize the wait time once you hail a car? How do these services optimally match you with other passengers to minimize detours? The answer to all these questions is ML. [25] The timeline for some of these changes is unclear, as predictions vary about when self-driving cars will become a reality: BI Intelligence predicts fully autonomous vehicles will debut in 2019; Uber CEO Travis Kalanick says the timeline for self-driving cars is “a years thing, not a decades thing”; Andrew Ng, Chief Scientist at Baidu and Stanford faculty member, predicted in early 2016 that self-driving cars will be mass produced by 2021. [25] There is image classification, which is an important part of the sensor data analysis on an AI self-driving car. [27] Object detection is another important aspect of an AI self-driving car, such as finding a pedestrian in an image of a street scene. [27] The existing MLPerf does not directly address AI self-driving cars per se, though it does touch on them. [27] Dr. Lance Eliot is CEO of Techbrium Inc. (techbrium.com) and a regular contributor as our AI Trends Insider; he has recently authored over 150 articles and has published 9 books on the future of driverless cars. [27]
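
As a hedged illustration of that perception side, the sketch below runs an off-the-shelf pretrained detector from torchvision over a single image; it is a generic example with a placeholder file name, not code from any production self-driving stack, which would use far more specialized models and sensor pipelines.

```python
# Illustrative object-detection inference with a pretrained torchvision model.
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

# "DEFAULT" loads pretrained COCO weights (older torchvision uses pretrained=True).
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

image = Image.open("street_scene.jpg").convert("RGB")   # placeholder image path
with torch.no_grad():
    prediction = model([to_tensor(image)])[0]

# Keep confident detections; in the COCO label set, class 1 is "person",
# so this roughly answers "is there a pedestrian in this street scene?"
keep = prediction["scores"] > 0.8
print(prediction["labels"][keep], prediction["boxes"][keep])
```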

For training purposes, they do allow that “hyperparameters (e.g., batch size, learning rate) may be selected to best utilize the framework and system being tested.” [27] For those of you who are relatively new to Machine Learning, here’s a handy tip: you ought to know each of the above ML models, and you ought to know the datasets that are being used with those models. [27] If you are someone who relies upon an AI system that uses Machine Learning, this will be of benefit to you too. [27] We distinguish between AI and machine learning (ML) throughout this article when appropriate. [25]

Uber’s Head of Machine Learning Danny Lange confirmed Uber’s use of machine learning for ETAs for rides, estimated meal delivery times on UberEATS, computing optimal pickup locations, as well as for fraud detection. [25] Through the use of machine learning algorithms, Gmail successfully filters 99.9% of spam. [25] Smart reply uses machine learning to automatically suggest three different brief (but customized) responses to answer the email. [25] Instagram, which Facebook acquired in 2012, uses machine learning to identify the contextual meaning of emoji, which have been steadily replacing slang (for instance, a laughing emoji could replace “lol”). [25]
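
Production systems such as Gmail's are of course far more sophisticated, but the basic idea of learning a spam filter from labeled examples can be sketched in a few lines of scikit-learn; the training emails below are invented toy data.

```python
# Toy spam-filter sketch with scikit-learn (illustrative only; made-up data).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

emails = [
    "win a free prize now, click here",
    "cheap loans approved instantly",
    "meeting moved to 3pm, agenda attached",
    "lunch tomorrow? let me know",
]
labels = [1, 1, 0, 0]                              # 1 = spam, 0 = not spam

spam_filter = make_pipeline(TfidfVectorizer(), MultinomialNB())
spam_filter.fit(emails, labels)

print(spam_filter.predict(["claim your free prize today"]))   # likely [1]
print(spam_filter.predict(["see you at lunch tomorrow"]))     # likely [0]
```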

One MIT paper highlights the possibility of using machine learning to optimize this algorithm. [25] This technology is powered by the 2015 acquisition of Looksery (for a rumored $150 million), a Ukrainian company with patents on using machine learning to track movements in video. [25]

Turi Create simplifies the development of custom machine learning models. [3] Machine learning is used for fraud prevention in online credit card transactions. [25] If you are determined to get a really high score on your new hardware or software for Machine Learning, you might be tempted to try to find loopholes in the MLPerf. [27] Gautam Narula is a machine learning enthusiast, computer science student at Georgia Tech, and published author. [25] At TechEmergence, we’ve developed concrete definitions of both artificial intelligence and machine learning based on a panel of expert feedback. [25] TechEmergence conducts direct interviews and consensus analysis with leading experts in machine learning and artificial intelligence. [25]

A single trip may involve multiple modes of transportation (i.e., driving to a train station, riding the train to the optimal stop, and then walking or using a ride-share service from that stop to the final destination), not to mention the expected and the unexpected: construction, accidents, road or track maintenance, and weather conditions can constrict traffic flow with little to no notice. [25] The Incredible PyTorch: a curated list of tutorials, papers, projects, communities and more relating to PyTorch. [3] Read Kumar’s paper, “Automated and real-time segmentation of suspicious breast masses using convolutional neural network.” [11] In the same way that technology was used to try to keep drunk drivers from starting their cars, using alcohol-detection breathalyzers, software could deny entry or prevent starting a car if, for example, the brakes are worn to a dangerous point. [5] In discussing a plan to race identical autonomous cars under controlled conditions, the author notes the main problem: “Nevertheless, the biggest challenge self-driving cars will have to overcome on the road is being able to react to the randomness of traffic flow, other drivers, and the fact that no two driving situations are ever the same,” the report said. [5] It’s almost noon, and the streets of Moscow are busy with cars driving close to each other, bikes, and pedestrians. [7]

Car owners today are notorious for allowing dangerous conditions in their vehicles to fester, or questionable equipment to go unrepaired. [5] He switched lanes once, and applied the brakes the second time, although the car looked like it was slowing down. [7] During one of the tests, annoyed that the automobile was following every rule, someone told Polishchuk: “Your car drives like my mother.” [7] Between the lidar measurements, the car uses odometry, calculating how much the wheels have turned and in which direction, Polishchuk says. [7] Since local legislation does not allow unmanned cars on public roads, one of his colleagues, Alex, is sitting behind the wheel hoping not to have to touch it. [7] Identifying which tasks to automate is critical, as Tesla recently found out in their quest to automate many aspects of car manufacturing. [6]
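
Wheel odometry of this kind is essentially dead reckoning: integrating how far each wheel has turned into an estimate of the vehicle's pose. The sketch below shows the textbook differential-drive update; it is a generic illustration, not Yandex's implementation, and a real vehicle would fuse this estimate with lidar, IMU, and GPS data.

```python
# Minimal wheel-odometry (dead reckoning) sketch for a differential-drive vehicle.
import math

def update_pose(x, y, theta, d_left, d_right, wheel_base):
    """Advance the pose estimate given the distance travelled by each wheel.

    d_left / d_right: wheel travel since the last update (metres),
                      derived from encoder ticks times wheel circumference.
    wheel_base:       distance between the wheels (metres).
    """
    d_center = (d_left + d_right) / 2.0          # forward motion of the body
    d_theta = (d_right - d_left) / wheel_base    # change in heading
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta = (theta + d_theta) % (2 * math.pi)
    return x, y, theta

# Example: both wheels roll 1 m forward, then the right wheel rolls slightly
# farther, so the vehicle moves ahead and turns a little to the left.
pose = (0.0, 0.0, 0.0)
pose = update_pose(*pose, d_left=1.00, d_right=1.00, wheel_base=1.5)
pose = update_pose(*pose, d_left=0.95, d_right=1.05, wheel_base=1.5)
print(pose)
```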

With harsh winters, drivers who constantly switch lanes, traffic jams and occasional crashes, the Russian capital of Moscow provides a challenging setting for testing autonomous cars. [7] Here is another key to achieving a truly safe autonomous car: sharing data among all the companies working on this problem. [5] He dreams that one day, there will be a fleet of autonomous cars in Moscow built on top of Yandex.taxi, the joint venture with Uber. [7] Graduates get to work on different kinds of projects, including the autonomous car and the new Yandex.Station, a device resembling the Amazon Echo. [7] The engineer says it’s a difficult scenario for an autonomous car because it involves human interaction: “A human driver waves his hand or shakes his head.” [7]

It is not on the surface yet: the NIPS conference is still oversold, corporate PR still has AI all over its press releases, Elon Musk still keeps promising self-driving cars, and Google keeps pushing Andrew Ng’s line that AI is bigger than electricity. [8] By far the biggest pin punching through the AI bubble was the accident in which an Uber self-driving car killed a pedestrian in Arizona. [8]

Our own self-driving car teams use this to design simulation tests for our NVIDIA DRIVE platform using the method we described in “High-Resolution Image Synthesis and Semantic Manipulation with Conditional GANs.” [10] Photorealistic simulation enables a safer, more scalable, and more cost-effective way to bring self-driving cars to our roads. [10] Tesla announced that its fully self-driving cars were very close, even selling that option to customers to be enabled later via a software update. [8]

I respect Marcus a lot; he behaves like a real scientist should, while most so-called “deep learning stars” just behave like cheap celebrities. [8] “Deep learning has given us a glimpse of the promised land, but we need to invest in other approaches,” said Dileep George, an A.I. expert and co-founder of Vicarious, which is based in Union City, Calif. [4]

The report uses this example of visual interpretation through cameras: “There is a massive amount of computation required to be able to take these pixels and figure out ‘is that a truck?’ or ‘is that a stationary cyclist?’ or ‘in which direction does the road curve?’ It’s this type of computer vision coupled with deep neural-network processing that is required by self-driving cars.” [5] Deep neural networks like NVIDIA’s DRIVE PX are now performing at a level of semantic decision making, although still far from autonomy. (4) The Parker AutoChauffeur now offers driverless transport for defined point-to-point trips. [5]

The computer is the next step in learning situations, processing choices and creating vehicle movement (or shutdown) instantly. [5] Tesla said that this “shadow mode” learning offers more complete experiential/reactive learning, since it constantly compares what the human driver does with what the system was programmed to do in any situation. [5] Intelligence augmentation and intelligent infrastructure are inherently multidisciplinary, and require going beyond the perspective of a single agent learning to map inputs to outputs. [6]
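
The mechanics of such "shadow mode" evaluation can be sketched generically: the system computes what it would have done, compares that with what the human driver actually did, and logs the disagreements for later analysis. All names and thresholds below are invented for illustration; this is not Tesla's implementation.

```python
# Generic "shadow mode" sketch: the autonomy stack proposes an action but never
# actuates it; disagreements with the human driver are logged for offline review.
# All names and tolerances here are invented for illustration.
from dataclasses import dataclass

@dataclass
class DrivingAction:
    steering_deg: float
    braking: float        # 0.0 (none) to 1.0 (full)

def shadow_compare(model_action: DrivingAction,
                   human_action: DrivingAction,
                   steering_tol_deg: float = 2.0,
                   braking_tol: float = 0.1):
    """Return a disagreement record if the model's proposal differs from the
    human driver's action beyond the given tolerances, else None."""
    steering_gap = abs(model_action.steering_deg - human_action.steering_deg)
    braking_gap = abs(model_action.braking - human_action.braking)
    if steering_gap > steering_tol_deg or braking_gap > braking_tol:
        return {"steering_gap_deg": steering_gap, "braking_gap": braking_gap}
    return None

disagreement = shadow_compare(DrivingAction(steering_deg=-5.0, braking=0.0),
                              DrivingAction(steering_deg=0.0, braking=0.3))
print(disagreement)   # logged for later analysis rather than acted upon
```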

Did you ever wonder what it’s like to build an AI personal assistant, or to bridge the language gap? Hint: There’s big data and machine learning involved. [7] Most companies are beginning to explore how to use machine learning and AI, and we wanted to give an overview and framework for how to think about these technologies and their roles in automation. [6] The reality is that many AI systems will use many different machine learning methods and techniques. [6] Along the way, we describe the machine learning and AI tools that can be used to enable automation. [6] First of all, what is called AI today is often really just machine learning. [6] Machine learning and AI will enable automation across many domains and professions. [6] Depending on the context, an AI system might be asked to solve different types of problems: reinforcement learning excels at problems that fall outside the realm of unsupervised and supervised machine learning. [6] Reinforcement learning has played a critical role in many prominent AI systems. [6]

One ongoing project by Tesla will allow its autonomous driving system to learn whether it is in charge of driving functions or not. (7) According to this report, Tesla’s AI system uses a monitor mode to “learn” how decisions are made by a human driver, although it will not be in command of any functions. [5] As I predicted, the place where the cracks in AI are most visible is autonomous driving, an actual application of the technology in the real world. [8]

This interesting proposition involving ethical dilemmas presented to smart machines is one subject of a study (1) about autonomous driving and emergencies. [5]

Or take the challenge of training autonomous vehicles to drive safely by training them on data from billions of miles of driving. [10] Humans make good and bad instant decisions; driving activities are inherently dangerous because people react differently to the exact same situation. [5] These mechanisms have evolved for a billion years to keep us safe, and driving context (although modern) makes use of many such reflexes. [8] That could take years of driving on public roads to collect. [10]

We will look at some broad safety/security challenges in non-human driving, then drill down to some specific top-level engineering being done today. [5]

Intel GO platforms combine three different processor types to produce a decision-making signal at the fusion level: “The compute required for autonomous driving can be divided into three intertwined stages: sense, fuse, and decide.” [5]
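
Conceptually, that three-stage split maps onto a simple staged pipeline. The sketch below is only a schematic of the sense, fuse, and decide structure, with invented placeholder logic; it is not Intel GO's software.

```python
# Schematic sense -> fuse -> decide pipeline (placeholder logic only).

def sense(camera_frame, lidar_scan, radar_targets):
    """Stage 1: turn raw sensor data into per-sensor detections."""
    return {
        "camera": list(camera_frame),   # e.g. detected bounding boxes
        "lidar": list(lidar_scan),      # e.g. clustered obstacles
        "radar": list(radar_targets),   # e.g. range/velocity tracks
    }

def fuse(detections):
    """Stage 2: merge parallel sensor streams into one world model."""
    world_model = []
    for source, objects in detections.items():
        for obj in objects:
            world_model.append({"source": source, "object": obj})
    return world_model

def decide(world_model):
    """Stage 3: map the fused world model to a driving decision."""
    if any(item["object"] == "pedestrian_ahead" for item in world_model):
        return "brake"
    return "maintain_speed"

decision = decide(fuse(sense(["car_left"], ["pedestrian_ahead"], [])))
print(decision)   # "brake"
```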

The self-driving car industry has several levels, with only the highest level (Level 5) representing full automation. [6] The technology’s perception and pattern-matching abilities are being applied to improve progress in fields such as drug discovery and self-driving cars. [4] On a computer screen that I’m not allowed to film, Polishchuk shows me how the self-driving car perceives the environment. [7] The Yandex self-driving car disengaged twice during our 20-minute ride through the city, obliging our human driver Alex to intervene. [7] The UK wants to show a very upbeat manufacturing and trade persona to Europe, and perhaps no other project is as hot on the world stage as driverless cars and trucks. [5] “In Moscow, the guys behind you honk the horn even before the traffic lights turn green,” says Dmitry Polishchuk, head of Yandex’s driverless car project. [7] To achieve a Level 5 driverless car, in which the steering wheel is optional, the Moscow-based company is constantly looking for talented people, and education is a big part of its HR strategy. [7] Shahin Farshchi examines the role artificial intelligence will play in driverless cars. [6]

Watch highlights covering artificial intelligence, machine learning, intelligence engineering, and more. [6]

I anticipate more papers and articles describing compelling and very practical hybrid systems in the future. [6] Go deeper: In January, Gary Marcus, a New York University professor, ignited a firestorm in the field with a paper that catalogued doubts about machine learning, the most broadly practiced method of AI. Among the leading lights to deride him was Facebook’s Yann LeCun. [13] “Vehicle detection in driving simulation using extreme learning machine”. [12] In Robert A. Heinlein’s novel The Number of the Beast (1980), Zeb Carter’s driving and flying car “Gay Deceiver” is at first semi-autonomous and later, after modifications by Zeb’s wife Deety, becomes sentient and capable of fully autonomous operation. [12] The film I, Robot (2004), set in Chicago in 2035, features autonomous vehicles driving on highways, allowing cars to travel more safely at higher speeds than if manually controlled. [12] In the film The Incredibles (2004), Mr. Incredible’s car drives autonomously while it changes him into his supersuit on the way to save a cat from a tree. [12] The Audi A8 was claimed to be the first production car to reach Level 3 autonomous driving, and Audi would be the first manufacturer to use laser scanners in addition to cameras and ultrasonic sensors for its system. [12] “As more autonomous driving becomes a reality and our cars begin to make more decisions, the amount of compute power required in these vehicles is growing at a rapid pace.” [28] By 2017, Mercedes had vastly expanded its autonomous driving features on production cars: in addition to the standard Distronic Plus features such as an active brake assist, Mercedes now includes a steering pilot, a parking pilot, a cross-traffic assist system, night-vision cameras with automated danger warnings and braking assist (in case animals or pedestrians are on the road, for example), and various other autonomous-driving features. [12] The NHTSA’s preliminary evaluation was opened to examine the design and performance of any automated driving systems in use at the time of the crash, which involved a population of an estimated 25,000 Model S cars. [12] “How do drivers behave in a highly automated car?”, Institute for Transport Studies, University of Leeds. Quote: “Drivers’ response to all critical events was found to be much later in the automated driving condition, compared to manual driving.” [12] When activated by the human driver, the car takes full control of all aspects of driving in slow-moving traffic at up to 60 kilometres per hour (37 mph). [12] Autonomous cars are predicted to increase traffic flow; provide enhanced mobility for children, the elderly, disabled, and the poor; relieve travelers from driving and navigation chores; lower fuel consumption; significantly reduce the need for parking space; reduce crime; and facilitate business models for transportation as a service, especially via the sharing economy. [12] When autonomous cars shift the responsibility of driving from humans to autonomous car technology, there is a need for existing liability laws to evolve in order to fairly identify the appropriate remedies for damage and injury. [12] Increases in the use of autonomous car technologies (e.g. advanced driver-assistance systems) are not only causing incremental shifts in this responsibility for driving but also reducing the frequency of accidents on the road.
[12] Autonomous cars could reduce labor costs; relieve travelers from driving and navigation chores, thereby replacing behind-the-wheel commuting hours with more time for leisure or work; and lift constraints on occupants who are unable to drive, distracted, texting, intoxicated, prone to seizures, or otherwise impaired. [12]

“Self-driving cars for country roads: Today’s autonomous vehicles require hand-labeled 3-D maps, but CSAIL’s MapLite system enables navigation with just GPS and sensors”. [12] Autonomous vehicles could increase the overall number of cars on the road, which could lead to a greater dependence on oil imports if smart systems are not enough to curtail the impact of more vehicles. [12] A question that programmers find difficult to answer is: what decision should the car make that causes the smallest damage to people’s lives? The ethics of autonomous vehicles is still being worked out and could possibly lead to controversy. [12] A 2013 survey of 1,500 consumers across 10 countries by Cisco Systems found 57% “stated they would be likely to ride in a car controlled entirely by technology that does not require a human driver”, with Brazil, India and China the most willing to trust autonomous technology. [12]

According to Tesla, starting 19 October 2016, all Tesla cars are built with hardware to allow full self-driving capability at the highest safety level (SAE Level 5). [12] Additional advantages could include higher speed limits; smoother rides; increased roadway capacity; and minimized traffic congestion, due to the decreased need for safety gaps and to higher speeds. Currently, maximum controlled-access highway throughput or capacity according to the U.S. Highway Capacity Manual is about 2,200 passenger vehicles per hour per lane, with about 5% of the available road space taken up by cars. [12] Eight involved rear-end collisions at a stop sign or traffic light, two in which the vehicle was side-swiped by another driver, one in which another driver rolled through a stop sign, and one where a Google employee was controlling the car manually. [12] In July 2015, three Google employees suffered minor injuries when their vehicle was rear-ended by a car whose driver failed to brake at a traffic light. [12]
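
That roughly 2,200 vehicles per hour per lane figure is consistent with a quick back-of-the-envelope check: flow is approximately speed divided by vehicle spacing. The numbers below are assumed round values (a typical highway speed and the 40 to 50 m headways cited later in this report), not measurements.

```python
# Back-of-the-envelope check of the lane-capacity figure (assumed round numbers,
# treating the headway as the full vehicle spacing for simplicity).
speed_kmh = 100                            # assumed typical highway speed
spacing_m = 45                             # roughly the 40-50 m headways drivers keep today
vehicles_per_hour_per_lane = speed_kmh * 1000 / spacing_m
print(round(vehicles_per_hour_per_lane))   # ~2222, close to the ~2,200 capacity figure
```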

Outside of these areas or circumstances, the vehicle must be able to safely abort the trip, i.e., park the car, if the driver does not retake control. [12] The second known fatal accident involving a vehicle being driven by itself took place in Williston, Florida on 7 May 2016 while a Tesla Model S electric car was engaged in Autopilot mode. [12] In 2018, in a subsequent civil suit between the father of the driver killed and Tesla, Tesla did not deny that the car had been on Autopilot at the time of the accident, and sent evidence to the victim’s father documenting that fact. [12] On 9 January 2016, Tesla rolled out version 7.1 as an over-the-air update, adding a new “summon” feature that allows cars to self-park at parking locations without the driver in the car. [12] The trouble is that, apart from a few daredevils abusing (and sometimes crashing) Teslas, there has been little indication to date that a significant number of the world’s drivers want such cars. [13] Based on Google’s accident reports, their test cars have been involved in 14 collisions, of which other drivers were at fault 13 times, although in 2016 the car’s software caused a crash. [12] With smart highways and technological advances implemented through policy change, dependence on oil imports may be reduced because individual cars would spend less time on the road, which could in turn affect energy policy. [12] The Distronic system was able to adjust the vehicle’s speed automatically to the car in front in order to always maintain a safe distance from other cars on the road. [12] The intersections will have no traffic lights and no stop signs, instead using computer programs that will communicate directly with each car on the road. [12]

“Self-Driving Car Technology’s Benefits, Potential Risks, and Solutions”. theenergycollective.com. [12] “You can imagine a world where there are no stoplights and cars communicate with each other, cities are eco-fenced, and you can only take self-driving taxis.” [28] “Self-driving cars programmed to decide who dies in a crash”. [12] Autonomous prototype cars appeared in the 1980s, with Carnegie Mellon University’s Navlab and ALV projects funded by DARPA starting in 1984 and Mercedes-Benz and Bundeswehr University Munich’s EUREKA Prometheus Project in 1987. [12] Autonomous racing car on display at the 2017 New York City ePrix. [12] “Autonomous cars – when will they take over?”. automobilesreview.com. [12]

Some people believe that autonomous cars will increase car ownership and car use because it will become easier to use them and they will ultimately be more useful. [12] “How and why do men and women differ in their willingness to use automated cars? The influence of emotions across different age groups”. [12] The survey also gave results on potential consumer interest in purchasing an automated car, stating that 37% of surveyed current owners were either “definitely” or “probably” interested in purchasing one. [12] In 2016, a survey in Germany examined the opinion of 1,603 people, who were representative in terms of age, gender, and education for the German population, towards partially, highly, and fully automated cars. [12] A very visual example of the moral dilemma that a software engineer or car manufacturer might face in programming the operating software is described in an ethical thought experiment, the trolley problem: the driver of a trolley has the choice of staying on the planned track and running over five people, or turning the trolley onto a track where it would kill only one person, assuming there is no traffic on it. [12] People are still worried about safety, and above all about the possibility of the car being hacked. [12] “Driverless cars face cyber security, skills and safety challenges”. www.v3.co.uk. [12] In a 2011 online survey of 2,006 U.S. and UK consumers by Accenture, 49% said they would be comfortable using a “driverless car”. [12] An autonomous car (also known as a driverless car, self-driving car, or robotic car) is a vehicle that is capable of sensing its environment and navigating without human input. [12] BMW’s all-electric autonomous car, called iNext, is expected to be ready by 2021; Toyota’s first self-driving car is due to hit the market in 2020, as is the driverless car being developed by Nissan. [12]

In a 2014 U.S. telephone survey by Insurance.com, over three-quarters of licensed drivers said they would at least consider buying a self-driving car, rising to 86% if car insurance were cheaper; 31.7% said they would not continue to drive once an autonomous car was available instead. [12] In December 2015, Tesla CEO Elon Musk predicted that a completely autonomous car would be introduced by the end of 2018; in December 2017, he announced that it would take another two years to launch a fully self-driving Tesla onto the market. [12] A funny thing happened: the problem turned out to be much more difficult than even Google could manage, and there is still no viable fully autonomous car on the horizon even after years and billions spent trying. [15] Reduced traffic congestion and the improvements in traffic flow due to widespread use of autonomous cars will also translate into better fuel efficiency. [12] Control systems on autonomous cars may use sensor fusion, an approach that integrates information from a variety of sensors on the car to produce a more consistent, accurate, and useful view of the environment. [12] In June 2011, the Nevada Legislature passed a law to authorize the use of autonomous cars. [12] According to the law, the Nevada Department of Motor Vehicles (NDMV) is responsible for setting safety and performance standards, and the agency is responsible for designating areas where autonomous cars may be tested. [12] In November 2017, Waymo announced it was testing autonomous cars in Phoenix, without drivers behind the steering wheels but with safety engineers in the back seats. [17] Arizona Governor Doug Ducey later suspended Uber’s ability to test and operate its autonomous cars on public roadways, citing an “unquestionable failure” of the expectation that Uber make public safety its top priority. [12] The potential benefits of autonomous cars include reduced mobility and infrastructure costs, increased safety, increased mobility, increased customer satisfaction, and reduced crime. [12] The first death of an essentially uninvolved third party is likely to raise new questions and concerns about the safety of autonomous cars in general. [12] In Isaac Asimov’s science-fiction short story “Sally” (first published May–June 1953), autonomous cars have “positronic brains”, communicate via honking horns and slamming doors, and save their human caretaker. [12] Incidents such as the first fatal accident caused by Tesla’s Autopilot system have led to discussion about revising laws and standards for autonomous cars. [12]
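
Sensor fusion of this kind is often introduced with a simple weighted combination of noisy measurements, where each sensor's estimate is weighted by the inverse of its variance (the core idea behind Kalman-style fusion). The sketch below is a generic textbook example, not any vendor's implementation.

```python
# Minimal sensor-fusion sketch: combine two noisy range measurements of the
# same obstacle, weighting each by the inverse of its variance.

def fuse_measurements(z1, var1, z2, var2):
    """Inverse-variance weighted fusion of two scalar measurements."""
    w1 = 1.0 / var1
    w2 = 1.0 / var2
    fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)   # the fused estimate is more certain than either sensor alone
    return fused, fused_var

# Example: lidar reports the obstacle at 25.0 m (low noise), radar at 26.5 m
# (higher noise); the fused estimate lands closer to the lidar value.
distance, variance = fuse_measurements(z1=25.0, var1=0.1, z2=26.5, var2=0.5)
print(distance, variance)   # ~25.25 m with variance ~0.083
```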

This marks the first time an individual outside an auto-piloted car is known to have been killed by such a car. [12] In 2005, Mercedes refined the system (from this point called “Distronic Plus”), with the Mercedes-Benz S-Class (W221) being the first car to receive the upgraded system. [12] The alliance between the French companies THALES and Valeo (provider of the first self-parking system, which equips Audi and Mercedes premium models) is testing its own system. [12] The world’s most accomplished battery inventor says he has a new cell aimed at electric cars that delivers double the energy density of existing lithium-ion and, in a first, actually achieves an increase in capacity when it’s charged and discharged. [13]

In March 2017, an Uber test vehicle was involved in a crash in Tempe, Arizona when another car failed to yield, flipping the Uber vehicle. [12] Apple is currently testing self-driving cars, and increased the number of test vehicles from 3 to 27 in January 2018. [12] At this level the car can act autonomously but requires the full attention of the driver, who must be prepared to take control at a moment’s notice. [12] Any car in which the driver can pretend a crash wasn’t his fault. [29] Currently, at highway speeds drivers keep between 40 and 50 m (130 to 160 ft) away from the car in front. [12] “There are many cars on the road; there are things that you have to deal with in the automotive cockpit.” [28] Among connected cars, an unconnected one is the weakest link and will be increasingly banned from busy high-speed roads, predicted a Helsinki think tank in January 2016. [12] Why it matters: To the degree the survey is accurate and reflects a broad global trend, everything from the world’s sprawling car industry to roads and cities themselves could be on the cusp of a fundamental transformation. [13] The car attempted to drive full speed under the trailer, “with the bottom of the trailer impacting the windshield of the Model S.” [12] Google is developing a variant called SLAM, with detection and tracking of other moving objects (DATMO), which also handles obstacles such as cars and pedestrians. [12] Google stated, “In this case, we clearly bear some responsibility, because if our car hadn’t moved there wouldn’t have been a collision.” [12] In January 2017, the NTSB released the report that concluded Tesla was not at fault; the investigation revealed that for Tesla cars, the crash rate dropped by 40 percent after Autopilot was installed. [12] According to the NHTSA, preliminary reports indicate the crash occurred when the tractor-trailer made a left turn in front of the Tesla at an intersection on a non-controlled access highway, and the car failed to apply the brakes. [12]

It can be cycled more than 23,000 times, while the typical electric car battery cycles 1,000 times. [13] What every car should have, if anyone cared about real safety. [29] Men felt less anxiety and more joy towards automated cars, whereas women showed the exact opposite. [12] Both India and China have placed bans on automated cars, with the former citing protection of jobs. [12] A car’s computer could potentially be compromised, as could a communication system between cars. [12] “If you’re a legacy car company, your dealers have gotten all their money from maintenance, your assembly people are skilled in mechanical assembly, and your purchasing people have long-term relationships that now have to be fractured.” [28] Iyad Rahwan, an associate professor in the MIT Media Lab, said, “Most people want to live in a world where cars will minimize casualties, but everyone wants their own car to protect them at all costs.” [12] “Uber puts the brakes on testing robot cars in California after Arizona fatality”. [12] The film Total Recall (1990), starring Arnold Schwarzenegger, features taxis called Johnny Cabs controlled by artificial intelligence in the car or by the android occupants. [12] The law also acknowledges that the operator will not need to pay attention while the car is operating itself. [12] “The way to think about it is that you have a car that can travel 200 miles, and after five years it can go 800 miles,” said Venkat Viswanathan, an assistant professor at Carnegie Mellon University. [13] You’ve already got $3,000 worth of software in there, and electronics are becoming a huge amount of the car. [28]

Even with all the amazing progress in AI, such as self-driving cars, the technology is still very narrow in its accomplishments and far from autonomous. [14] It is they who are now leading the charge, launching production models with progressively more limited incremental autonomous driving capabilities, forcing big tech companies to partner with them, admitting that producing self-driving cars by themselves is too big a bite. [15] Industry analysts look toward self-driving cars for ride-hailing applications and autonomous semi-trailer trucks as the vanguard of autonomous driving. [28]

Next-generation high-speed cellular networks. 5G is not necessary for a truly autonomous self-driving car (i.e., one not reliant on external infrastructure and capable of functioning in any and all conditions), but it is absolutely essential for the profitable monetization of passengers in a self-driving car. [29]

RANKED SELECTED SOURCES (30 source documents arranged by frequency of occurrence in the above report)

1. (152) Autonomous car – Wikipedia

2. (19) How to think about AI and machine learning technologies, and their roles in automation – O’Reilly Media

3. (17) Everyday Examples of Artificial Intelligence and Machine Learning

4. (16) Semiconductor Engineering .:. Progress And Chaos On Road To Autonomy

5. (15) Deep Learning Use Cases – DATAVERSITY

6. (15) Safeguarding autonomous vehicles: The role of AI | Automotive IQ

7. (15) Inside Yandex self-driving car: Here’s what it’s like to ride on Moscow’s crazy roads | ZDNet

8. (14) Introduction to Deep Learning: Part 1 | AIChE

9. (13) Axios Future – June 3, 2018 – Axios

10. (13) Is There a Smarter Path to Artificial Intelligence? Some Experts Hope So – The New York Times

11. (10) Machine Learning Benchmarks and AI Self-Driving Cars – AI Trends

12. (10) GitHub – mbadry1/Trending-Deep-Learning: Top 100 trending deep learning repositories sorted by the number of stars gained on a specific day.

13. (10) The AI winter is well on its way | VentureBeat

14. (8) An Honest Glossary of Terms Relating to Self-Driving, Mobility, Tesla, and More – The Drive

15. (7) How to Do Distributed Deep Learning for Object Detection Using Horovod on Azure | Machine Learning Blog

16. (6) Deep Learning Enables Cancer Diagnosis Via Ultrasound | NVIDIA Blog

17. (5) NVIDIA Brings Tensor Core AI Tools, Super SloMo, Cutting-Edge Research

18. (5) What do machine learning researchers think about the obsession deep learning aficionados have with regards to differentiable models? – Quora

19. (3) Advancements in Dynamic and Efficient Deep Learning Systems – insideBIGDATA

20. (3) What is recurrent neural networks? – Definition from WhatIs.com

21. (3) GOOG Archives – ARK Investment Management

22. (3) Why Microsoft’s big bet on deep learning could go bad | Computerworld

23. (2) Semantic segmentation – Udacity's self-driving car engineer nanodegree

24. (2) AI Winter Addendum – Piekniewski's blog

25. (2) Machine Learning Engineer | Udacity

26. (1) Neurala Jobs, Office Photos, Culture, Video | VentureFizz

27. (1) AI with AI | CNA

28. (1) Synced | AI Technology & Industry Review

29. (1) Self Driving Cars Archives – AI Trends

30. (1) Uber at CVPR