2014 Invensense Developer's Conference
I attended the 3rd Invensense Developer's Conference June 11 and 12, 2014. Here is a brief report of what I heard and saw.
Firstly, the Santa Clara Convention Center was a convenient and appropriate venue for the gathering of about 350 people from all over the world interested in the applications of motion sensing. The hospitality and organization were superb, and there was plenty of time for networking. The conference content was engaging, and several tutorial tracks tailored to specific interests were offered over the one-and-a-half-day conference. I attended the developer track, so I'll have little to say about the wearable, marketing, and audio tracks. There were presentations on all aspects of Invensense's product lines, as well as demos and displays of many current applications of this technology by Invensense and the many independent developers in attendance. The keynote talk by Invensense CEO Behrooz Abdi emphasized the idea of ubiquitous motion sensing, with the vision of all devices in the 'internet of things' being motion aware. Several product announcements were made at the conference, and I got a sense of the outstanding challenges in the field of motion sensing. The following are the highlights from my perspective.
There was a developer display area between the main hall and the tutorial rooms where refreshments were regularly served, demos took place, and detailed discussion and networking were frequent. Everyone was friendly and eager to talk about their particular applications. There was a strong showing by those developing wearable applications for medical diagnostics, fitness, and sports, and several demos targeted immersive gaming and other visual perception applications.
Invensense provided a detailed tutorial on their newly announced ICM206XX family of motion sensors, which is pin compatible with the current flagship motion sensor, the MPU-9250. The new device will combine an accelerometer and gyroscope with an on-board digital motion processor (DMP), similar to the MPU-6500. However, the ICM206XX family will have an improved DMP that allows DMP fusion of 9-axis data (3 axes from an external magnetometer) with ultra-low current consumption of less than 2 mA. The intent is to make device operation independent of the microcontroller or operating system, with an "open-source" approach for all firmware (the application programming interfaces, or APIs, not the proprietary DMP coding). This is in contrast to the current situation with the MPU-9x50 family, where 9-axis fusion must take place in the host microcontroller; the MSP430 is the only microcontroller currently supported for 9-axis fusion by Invensense's firmware. Additionally, improved FIFO capability will automatically sync the sensors' sample rates to avoid collisions and ensure accurate timing and, therefore, more accurate motion fusion output. Overall, this new sensor promises to improve the ease of use of Invensense motion fusion technology and promote Behrooz Abdi's vision of ubiquitous motion sensing by accommodating a large variety of microcontroller platforms.
The other new device announcement was the ICM207XX family, which can be thought of as an ICM206XX with an embedded pressure sensor. Unfortunately, this device will not be package and pin compatible with the MPU-9250. Both devices should be available to developers by the end of summer 2014.
One technical note of interest in the presentations on the new devices and FIFO technology was Invensense's observation that most developer applications they see read the data registers directly rather than using the FIFO. Accumulating data in the FIFO while the microcontroller is idled and burst reading from the FIFO on a hardware interrupt can save power, but the main benefit is better timing registration of the data. Reading the data registers sequentially can misalign the data in time, which introduces additional error in the time derivatives used by the sensor fusion algorithms. By making use of the FIFO, all three data sources are time stamped at the FIFO sample frequency and are therefore time aligned. I admit to being one of those who have been reading directly from the registers, but the Invensense case for using the FIFO is compelling.
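To make the FIFO argument concrete, here is a minimal sketch of what an interrupt-driven burst read might look like on the MPU-9250 over I2C. The register addresses are taken from the MPU-9250 register map; the writeByte() and readBytes() helpers are hypothetical stand-ins for whatever I2C routines the host microcontroller provides, and the frame layout assumes only the accelerometer and gyroscope are routed to the FIFO.

```cpp
#include <stdint.h>

// Hypothetical I2C helpers -- stand-ins for the host's own I2C routines.
void writeByte(uint8_t addr, uint8_t reg, uint8_t value);
void readBytes(uint8_t addr, uint8_t reg, uint8_t count, uint8_t *dest);

// MPU-9250 register addresses (from the MPU-9250 register map)
#define MPU9250_ADDRESS 0x68
#define FIFO_EN         0x23
#define USER_CTRL       0x6A
#define FIFO_COUNTH     0x72
#define FIFO_R_W        0x74

// Route accel + gyro samples into the FIFO and enable it.
void setupFifo() {
  writeByte(MPU9250_ADDRESS, USER_CTRL, 0x40);  // set FIFO_EN bit in USER_CTRL
  writeByte(MPU9250_ADDRESS, FIFO_EN,   0x78);  // gyro x/y/z + accel -> FIFO
}

// On the hardware interrupt, burst-read complete 12-byte frames
// (3 x 16-bit accel followed by 3 x 16-bit gyro) so each frame's
// samples share the same time stamp.
void readFifo(int16_t accel[3], int16_t gyro[3]) {
  uint8_t count[2], frame[12];
  readBytes(MPU9250_ADDRESS, FIFO_COUNTH, 2, count);
  uint16_t fifoCount = ((uint16_t)count[0] << 8) | count[1];

  while (fifoCount >= 12) {                     // drain whole frames only
    readBytes(MPU9250_ADDRESS, FIFO_R_W, 12, frame);
    for (int i = 0; i < 3; i++) {
      accel[i] = (int16_t)(((int16_t)frame[2*i]     << 8) | frame[2*i + 1]);
      gyro[i]  = (int16_t)(((int16_t)frame[2*i + 6] << 8) | frame[2*i + 7]);
    }
    fifoCount -= 12;
  }
}
```

Because each frame is written to the FIFO atomically at the sample rate, the accelerometer and gyroscope values parsed from a single frame correspond to the same instant, which is exactly the timing registration benefit Invensense described.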
Invensense also announced a new Motion Driver 6.0 package, available at the end of June, that will provide 9-axis sensor fusion for any ARM microcontroller, not just the MSP430, for use with the MPU-9150 and MPU-9250 sensors. The DMP will still perform only 6-axis sensor fusion, but 9-axis sensor fusion will be provided by APIs that should run on any ARM device. This is a significant improvement in capability for application developers who don't want to, or can't, develop their own sensor fusion algorithms.
There was an excellent tutorial on hardware considerations for designing printed circuit boards (PCBs) using the MPU-9250. This is a current project of mine, and I was particularly pleased to have many of my questions directly addressed in the presentation. Invensense apparently wants and expects application developers to incorporate their devices into custom hardware designs. While the technology to design and make PCBs is mature, one of the most frequent difficulties I heard expressed at the conference was how to create the hardware platform for a particular motion sensing application. I am glad Invensense is trying to help developers in this regard, and I would encourage more of this type of tutorial in the future.
Lastly, I learned that one of the biggest outstanding challenges in motion sensing is using 9-axis motion sensors as inertial guidance units, not only for absolute orientation, which is a solved problem, but for absolute positioning as well. The latter is extremely difficult because deriving position from an accelerometer involves, simplistically, a double integration of the sensor data. This means that even the smallest error accumulates continuously until, in a fairly short time, the position estimate is unacceptably off unless it is recalibrated periodically by an external source of information such as a Bluetooth node position, cellular tower triangulation, or a GPS signal. For short distances, say within a shopping mall, relative distances from the entrance via a motion sensor coupled with an altimeter might be enough to find the exit when one is done shopping. However, how can one design a small, low-power device (embedded in a smart phone or tablet) that relies on a 9- or 10-axis motion sensor alone for absolute orientation and positioning for arbitrarily long times? This is the challenge...
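To illustrate how quickly the double integration goes wrong, here is a minimal sketch of my own (a back-of-the-envelope illustration, not something presented at the conference) that propagates an assumed constant accelerometer bias of 0.01 m/s^2, roughly 1 milli-g, through the two integrations at a 100 Hz sample rate.

```cpp
#include <cstdio>

int main() {
  const double bias = 0.01;  // assumed residual accelerometer bias, m/s^2 (~1 mg)
  const double dt   = 0.01;  // 100 Hz sample rate

  double velErr = 0.0, posErr = 0.0;
  for (int i = 1; i <= 6000; ++i) {   // simulate 60 seconds
    velErr += bias * dt;              // first integration: error grows as b*t
    posErr += velErr * dt;            // second integration: error grows as 0.5*b*t^2
    if (i % 1000 == 0)                // report every 10 seconds
      std::printf("t = %2.0f s, position error ~ %5.2f m\n", i * dt, posErr);
  }
  return 0;
}
```

Even with this optimistic bias, the position error grows quadratically and reaches roughly 18 meters after only one minute, which is why periodic corrections from an external reference are unavoidable with today's consumer-grade sensors.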
In summary, I enjoyed my first (this was the 3rd) Invensense Developer Conference and I look forward to attending next year. Perhaps you will see me there at one of the developer showcase booths with my own motion sensing applications!