We moved to another office nearby that has significantly more space. Now we can also do more indoor tests!
Had a fun evening presenting a poster of our recent work on the Autonomous Wheelchair at the Ottawa Artificial Intelligence & Machine Learning Meetup on June 26, 2019, and running a few video demonstrations.
We are pleased to welcome our Summer 2019 Intern, Harjap Gill, from the Carleton University Computer Engineering program. Harjap will be working on the Simulator, adding features so it can be used to develop & test our artificial neural network models specific to the wheelchair.
Was honoured to present a Lightning Talk at the May Machine Learning Meetup Group in Montreal about the Proof of Concept Prototype that we built last year to test out our End-to-End Model trained via Imitation Learning.
There have been quite a few technical developments over these past few months worth talking about … all towards improving our autonomous wheelchair. Here are some of the highlights:
Data Collection Box: Ready to Start Validation
The “fuel” at the heart of the Deep Learning algorithms being developed for our autonomous wheelchair is data! To date, here at Blue Horizon AI, we have been using open data sets from self-driving car development to further develop the “Perception” module that drives the wheelchair. However, wheelchairs drive in less structured environments than cars, such as on sidewalks, pathways, and indoors. Moreover, a wheelchair drives more like a tank than a car. Thus we need additional and better data, specific to the unique nature of actual wheelchair driving, to better train the Perception/Auto function, with the end goal of improving the performance of our “auto-pilot” wheelchair function.
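To make the “tank vs. car” point concrete: a powered wheelchair is a differential-drive vehicle that steers by spinning its two wheels at different speeds, and can even rotate in place, while a car must roll forward to turn. The short Python sketch below shows one common way such a joystick-to-wheel mapping works; the function name and axis conventions are illustrative assumptions, not our actual driver code.

```python
# Illustrative sketch only: mapping joystick axes to tank-style
# (differential-drive) wheel speeds. Not our production driver code.

def joystick_to_wheel_speeds(forward, turn):
    """forward and turn are joystick axes in [-1, 1];
    returns (left, right) wheel speeds in [-1, 1]."""
    left = forward + turn
    right = forward - turn
    # Rescale so a hard forward+turn command never exceeds motor limits.
    scale = max(1.0, abs(left), abs(right))
    return left / scale, right / scale

# Full forward with a gentle right turn:
print(joystick_to_wheel_speeds(1.0, 0.25))   # -> (1.0, 0.6)
# Spin in place, something a car cannot do:
print(joystick_to_wheel_speeds(0.0, 1.0))    # -> (1.0, -1.0)
```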
To collect such data, we have completed phase 2 of a Data Collection Box that is mounted on the wheelchair and powered by the wheelchair battery. This data collection function, built on low-cost hardware such as a Raspberry Pi 3, collects raw colour video data & joystick driver data. A post-collection process then converts this raw data into a data set suitable for further offline training of the artificial neural network that drives the wheelchair autonomously. The box is now undergoing intensive robustness testing to ensure that it can operate on a wheelchair for long periods, securely collecting valuable & correct data under a variety of conditions and durations.
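As a rough illustration of what that post-collection step involves, the sketch below pairs each camera frame with the joystick sample closest to it in time, yielding (timestamp, forward, turn) training examples. The file format, field names, and helper functions here are assumptions made for the example, not our actual data schema.

```python
# Hedged sketch of pairing camera frames with joystick commands by timestamp.
# CSV columns ("timestamp", "forward", "turn") are illustrative assumptions.
import bisect
import csv

def load_joystick_log(path):
    """Read (timestamp, forward, turn) tuples from a CSV joystick log."""
    samples = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            samples.append((float(row["timestamp"]),
                            float(row["forward"]),
                            float(row["turn"])))
    samples.sort()
    return samples

def label_frames(frame_times, joystick_samples):
    """Attach the joystick sample nearest in time to each frame timestamp."""
    times = [t for t, _, _ in joystick_samples]
    dataset = []
    for ft in frame_times:
        i = bisect.bisect_left(times, ft)
        # Compare the neighbours on either side and keep the closer one.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(times)]
        j = min(candidates, key=lambda k: abs(times[k] - ft))
        _, forward, turn = joystick_samples[j]
        dataset.append((ft, forward, turn))
    return dataset
```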
Nvidia Jetson TX2: AI Computing on the “Edge”
We have ported one of our prototype autonomous wheelchair hardware platforms to the Nvidia Jetson TX2 ( https://www.nvidia.com/en-us/autonomous-machines/embedded-systems/jetson-tx2/ ). This platform gives us a high-compute GPU in a portable, low-power & low-cost package, so we can run more advanced artificial neural network models with the goal of self-driving the wheelchair with better performance. It also gives us extra compute capability for developing & training artificial neural networks, alongside our in-house GPU compute platform. Preliminary indoor testing has been completed, and more extensive lab & outdoor testing is planned.
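For readers curious what “running a model on the edge” looks like in practice, here is a minimal, illustrative PyTorch inference snippet of the sort that runs on the TX2’s GPU; the model file name and input shape are placeholder assumptions, not our actual artefacts.

```python
# Minimal illustrative sketch of GPU inference on an edge device with PyTorch.
# "wheelchair_model.pt" and the 224x224 input size are placeholders.
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = torch.jit.load("wheelchair_model.pt", map_location=device).eval()

with torch.no_grad():
    frame = torch.rand(1, 3, 224, 224, device=device)  # stand-in camera frame
    command = model(frame)  # e.g. a predicted (forward, turn) drive command
```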
Middle-Ware Software Improvements
The middle-ware part of our system is what “glues” the Perception/Auto intelligence function to the low-level hardware that sends electrical signals to the wheelchair motor control system; it also manages the incoming video stream from the camera and other sensor information. The middle-ware is further responsible for managing the data collection function, providing the user interface to start & stop the data gathering & autonomous driving functions, and reporting system status to the user.
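In pseudocode form, these responsibilities boil down to a small control loop. The sketch below uses hypothetical component classes (camera, model, motors, recorder) purely to show how the pieces are glued together; none of these names come from our actual code base.

```python
# Highly simplified, illustrative middle-ware loop; the component classes
# (camera, model, motors, recorder) are hypothetical stand-ins.
import time

class Middleware:
    def __init__(self, camera, model, motors, recorder):
        self.camera = camera      # incoming video/sensor stream
        self.model = model        # Perception/Auto intelligence function
        self.motors = motors      # low-level link to the motor control system
        self.recorder = recorder  # data collection function
        self.autonomous = False   # toggled via the user interface
        self.recording = False    # toggled via the user interface

    def run(self):
        while True:
            frame = self.camera.read()        # latest image + sensor data
            if self.recording:
                self.recorder.save(frame)     # log raw data for training
            if self.autonomous:
                command = self.model.predict(frame)
                self.motors.send(command)     # electrical signal out
            time.sleep(0.05)                  # ~20 Hz control loop
```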
A set of software improvements to our middle-ware system has recently been completed to make data collection & testing of the autonomous functions easier, and to support portability to other hardware platforms, laying the groundwork for eventual open-sourcing of this autonomous wheelchair project.
We have had two additional staff members join us to further assist us on the autonomous wheelchair project.
Najmeh Taleb, a Carleton University computer science PhD graduate, joined us in September 2018 for Deep Learning training and, as of January, is with us as a “Deep Learning” researcher. She will be focusing on the AI vision learning algorithms used in the wheelchair auto-pilot function, with the goal of improving its performance.
Arjun Bhatti, a 3rd-year electrical engineering student at Carleton University, joined us in February as our Technical Intern. Arjun will be helping to improve the Arduino board hardware and software for the autonomous wheelchair project & with the extensive lab & field testing being planned.
A big welcome to Najmeh and Arjun!
Had an amazing week networking & learning about the latest research advances in Artificial Intelligence and Machine Learning at the NeurIPS conference. This is one of the largest artificial intelligence (AI) and machine learning conferences of the year & was held in Montreal, with many attendees from around the world.
With the whirlwind of talks, tutorials, workshops, demonstrations, spotlight sessions, and posters, it was like trying to drink from a fire hose! I can’t wait to try out some of those ideas this upcoming year!
Hectic month moving out of our old lab/office to a new, larger location nearby.
Was thrilled to have been invited to attend the North America Women Techmakers Leads Summit hosted at Google’s New York City office on October 13th. Met some amazing women who are leading all sorts of areas in technology from across North America! Thank you to Google for hosting!