
Out of the Bottle

Sunday, September 16th, 2001 Michael Barr

The genie is really out of the bottle this time. As first reported in Komsomolskaya Pravda newspaper and later by ABCNews.com, Russian engineer and entrepreneur Dmitri Zhurin recently invented a talking bottle cap. This is a bottle cap that looks like any other, but houses a tiny battery-powered embedded system that speaks to those gathered around the bottle for a drink.

Why would anyone want a talking bottle cap? For several reasons, according to the inventor. First, because Russians like to drink, but don’t like to drink alone. Initially, the voice in the cap offers only generally helpful instructions like “pour.” As additional drinks are taken from the bottle, however, the cap’s performance gets livelier, ultimately providing a friendly group of incorporeal drinking companions. The second reason for a talking bottle cap is that it can help out with the toasting duties. It seems Russians mostly drink in communal rounds at parties, with a toast preceding each round. Coming up with a large number of toasts in one evening can be a real challenge for the host or hostess.

Clearly, these problems are so compelling that the market had to respond with a solution—hence Mr. Zhurin’s Vodka Genie. Now, whether or not consumers will actually buy the Vodka Genie is an interesting question. What’s more interesting to me, though, is that here’s yet another place where no one expected computing power to turn up—and yet it has.

It would be a major understatement to say that in 1943, when Thomas Watson commissioned the oft-cited study that concluded there was a total world market for five computers and that IBM would make all of them, no one could have imagined computers inside bottle caps. The first commercial microprocessor wouldn’t even be developed for three more decades!

Yet look at where we are less than three decades after the invention of that first microprocessor. We are literally surrounded by computing engines—the vast majority of them unrecognized as such. And this is only the beginning of the next era in computing—the embedded era.

So much attention is focused on developments at the high-end—cheaper 32-bit processors, entire systems-on-a-chip, increasing memory budgets, connectivity to the world, millions of lines of code, the need for better development languages and debug tools, and increasing use of off-the-shelf software components—that it’s easy to forget that there are ongoing developments at the low-end of the spectrum too.

Processors of the 16-, 8-, and 4-bit varieties get cheaper every year too. New family members add more on-chip memory and peripherals at no additional cost. Some even include specialized capabilities, like speech synthesis, for niche applications like talking dolls and bottle caps. All for just pennies a chip, at current 4-bit price points.

The point I’m trying to make is that 4- and 8-bit micros will never be replaced. In truth, the number of new opportunities for simple 4-bit micros is expanding at a much faster rate than the number of new uses for 32-bitters. And it only takes a single engineer and a few months to design and build a disposable product like the Vodka Genie, which could very well sell millions before this reaches your mailbox.

NOTE: this article was originally published on 6/2/01.

Obituaries

Saturday, September 15th, 2001 Michael Barr

Ergo, Audrey. Entered into eternal cancellation on Wednesday, March 21, 2001. Much-ballyhooed simplifier of modern everyday life; beloved step-sister of Kerbango, also cancelled; preceded in cancellation by Netpliance’s i-opener; survived—so far anyway—by Compaq’s iPAQ, Honeywell’s WebPAD, and parent 3Com’s “core businesses.” In lieu of flowers, the developers request that memorial contributions be made to Nasdaq:COMS.

Depressing, isn’t it? Coincident with the announcement of its latest quarterly results, 3Com quietly stated that it would “discontinue its consumer Internet Appliance product lines.” In other words, no more Ergo Audrey home “nerve center” and no more Kerbango “Internet radio.” Even at a time of such opportunity for the emerging Internet Appliance market, the need to satisfy a shaky Wall Street managed to kill off two of the more promising new devices of that genre.

3Com didn’t elaborate on its rationale; perhaps initial sales of Audrey were disappointing. However, five months and four days seems hardly long enough to call any product a failure—at least not in a market poised for a “five-year growth rate of 73 percent” (words used to launch Audrey on October 17th, 2000 and attributed to research by analysts at Cahners In-Stat). The award-winning, though much-delayed, Kerbango, which 3Com had spent $80 million to acquire less than eight months earlier, was not even offered a chance to test its market for a single day.

One obstacle for such early Internet Appliances is certainly price. Whether they promise a simpler way to access e-mail; schedule and address book synchronization; wireless Web access; electronic books; thousands of channels of digital music; digital video recording; still image capture; or some combination of features from that list, their price is invariably higher than that of the general-purpose computers they still very much compete with.

In a period of free-after-rebate 700MHz PCs, the entry price for Audrey (based on a 200MHz National Geode GX1 processor with 16MB of Flash, 32MB of RAM, and the QNX RTOS) was a whopping $499; and it cost $50 more to change the color of the plastic; more still to get a USB Ethernet adapter (for those with broadband connections). All that to accomplish a set of tasks any dunderhead could configure a four-year-old Pentium-90 system to do: e-mail, Web browsing, and synchronization of a PDA with an address book and calendar.

Sure, Audrey was more compact than a PC (not much larger than its 6¼” x 4¾” color touch screen, for those who haven’t seen one). She was also far easier to use and maintain. But a PC can also play games, offers word processing and personal finance software, and oh-so-much-more. It’s hard to compete.

If individual Internet Appliances are to succeed, and I believe many will, they need to find niches that aren’t being filled well right now. And they need to fill them at attractive prices. Check out Kodak’s new mc3 for a perfect example. At Audrey’s launch last October, 3Com asserted that its Internet Appliance product line would offer “the best of the Internet in a convenient and intuitive way.” They forgot to add, “for a limited time only.” That’s too bad. 

Audrey and Kerbango will both be missed; we hardly knew them.

NOTE: this article was originally published on 5/18/01.

Embedded Head Count

Friday, September 14th, 2001 Michael Barr

“Does anyone know how many embedded engineers there are in the U.S.?”

That was the question posed in the comp.arch.embedded newsgroup last August. My first thought was “no, nobody really does.” But then I saw that another poster, a frequent contributor to the group, had already responded:

“There are presently over 2.4 million ‘Embedded Engineers’ in the U.S. This figure increases 21% to 35% annually.”

The author quoted an analysis he had done himself as the source of this information and linked to his website. But when I went to his website I couldn’t find any material related to this topic at all. So I posted a response stating that I felt his figure was way off the mark and asking to see his analysis.

If there were really 2.4 million “embedded heads” in the U.S. we’d account for approximately 1 out of every 110 Americans. While a figure like that is not out of the question for certain occupations—police officers and teachers are even more prevalent in our society—it seems at least an order of magnitude too big for embedded engineers. One out of every 1100—about 250,000 individuals—seems far more reasonable.

Needless to say, after some back and forth (both public and private), it became clear that the guy who seemed so confident that there were 2.4 million of us and so eager to announce this to the world, couldn’t give any actual justification for his numbers. At one point he mentioned that he “based the results on figures obtained from the U.S. Bureau of Labor Statistics.” (He claimed to “have the raw data broken out somewhere”, but never did provide anything of the sort.)

Judging from two-year-old statistics I found on the BLS website (stats.bls.gov), 2.4 million might be an accurate figure for the entire category of “computers/hi-tech” workers. But that group also includes every computer programmer, database analyst, sysop, network administrator, web developer, webmaster, and many others. We are certainly far outnumbered even within that subpopulation.

So what is the true number of embedded engineers? I don’t know; probably nobody does. What I can tell you is that an extrapolation of 1992 data from the National Science Foundation could put the “number of engineers [of all types] in manufacturing” at over a million, and that BLS reported about 350,000 “electronic/electrical engineers” in 1998. IEEE-USA has about 220,000 members (including students), the ACM has “over 80,000 worldwide”, and ESP has 60,000 qualified subscribers. My book aimed at embedded newbies has sold about 20,000 copies in two years—making it an all-time best-seller in the embedded category. Finally, about 10,000 of us (not counting exhibitors) are expected to attend the Embedded Systems Conference in San Francisco in April.

Mulling over all of these numbers and considering the likely weight of each in the calculation, my mind keeps coming back to the figure of 250,000. Though by no means scientific and probably off by as much as 20% one way or the other, that’s about the best estimate I can give you today.

I’m not even going to discuss the ludicrous annual growth rate suggested by the same poster. Not surprisingly, this guy is now in marketing.

NOTE: this post originally published 3/10/01.

21st Century Blues

Thursday, September 13th, 2001 Michael Barr

Let me be the first to properly welcome you to the 21st century and the new millennium. Just one short year ago, it seemed as though life as we know it (or at least computing as we know it) might grind to a halt on the false millennial eve because of short-sighted engineering decisions made decades earlier.

Having earned my stripes in the embedded trenches, I was quick to tell anyone who asked that there was nothing to fear on New Year’s Eve 1999. “Embedded developers simply don’t build unneeded functionality, like calendars, into their systems,” I must have explained to a hundred friends and family. It seems I was right. The power stayed on; the water ran; no elevators stuck; no airplanes fell from the sky; traffic lights continued to control access to intersections; and Dick Clark remained on the air (that last one, unfortunately).

But these days I’m less confident in the embedded systems we entrust our lives and livelihoods to. It seems that everywhere I go vendors are encouraging the inclusion of unneeded functionality, and far too many developers are taking them up on it. Consider embedded Linux. While not so unreasonable a choice in a few specific classes of systems—like settop boxes or embedded PCs—Linux is clearly overkill in the vast majority.

How do you even begin to test the safety and reliability of a system with so much complexity and so many authors? Can systems made from a mish-mash of off-the-shelf software components and rushed to the production floor be trusted? Who will certify that these systems are worthy of deployment or purchase? And who will ensure that they are safe and reliable?

Looking back now, I wonder how anyone even found time in 1999 to fix date-related bugs and/or certify systems as “Y2K Compliant.” The U.S. economy has been running at full-speed for well-nigh a decade. The high-tech job market is hot and the amount of work for each engineer to do is astounding. In such a climate, anyone halfway to a technical degree can find a job writing software for real products. Combine that with the pressures to get products to market quickly and you’ve got a clear recipe for disaster.

Surely, despite such horrible past disasters as Therac-25, the worst software-induced losses of life and limb lie ahead of us. We must, as an industry and to a person, insist on a higher standard of engineering. We must test our systems and design them to ensure their consistent behavior. Safety and reliability must be our first goals, not our last.

I implore all of you to raise the issues of safety and reliability within your own companies. Avoid unneeded functionality at all cost. After all, years or decades from now human lives or livelihoods may still depend on the engineering decisions you make today.

NOTE: this article originally published 01/01/01.

Open Source Embedded Software

Wednesday, September 12th, 2001 Michael Barr

Magazines like Embedded Systems Programming are founded on the ideal that the free flow of information between technological peers, even at competing organizations, benefits everyone. This is a view shared by the proponents of open source software. Whether it’s an operating system, a memory test suite, or simply a useful design paradigm, why should any one of us reinvent what has already been invented (and tested) by others in our profession?

Certainly, there are exceptional projects with exceptional needs. In such cases, invention of new techniques and technologies cannot be avoided. We can’t always combine existing components into a useful whole. To take an example from childhood, you can build quite a lot of neat things with a basic set of Legos, but a “spaceship windshield” is a one-of-a-kind part for one-of-a-kind projects.

Most of the time, though, we’re all using the same basic components. Processors, operating systems, device drivers, and other building blocks are transferable from one company or project to another. Once developed, a technology or technique is, in fact, most valuable when it is shared with others. This benefits the creators too, as others may strengthen the component by finding and fixing flaws. This is the academic model of development, transferred to the commercial marketplace.

Unfortunately, there are many roadblocks to such openness. I recently ran into one of these when I tried to arrange a new point/counterpoint discussion. As envisioned, the debate would have addressed the differences in thinking and approach between two competing groups working toward real-time Java. A technological leader of each group was willing to participate and both thought this open debate would be a good way to uncover any flaws in one or both approaches. It might even have helped to bring the two groups together, if they found a high degree of overlap.

Plans for the debate were put on hold, though, after non-technical people on both sides expressed concern. It seems both groups are jockeying for position in the marketplace; both hope their approach will become the de facto standard for real-time Java. Though both groups acknowledge the need for open debate in the discussion of technology that could risk human lives and are conducting their own meetings publicly, they remain unwilling to debate each other.

In addition to the feeling of frustration at the intrusion of non-technical issues and people into what should have been a purely technical discussion, I also have a feeling of dread. I suspect that progress toward a real-time Java standard will be slowed (or doomed to failure) as a result of the lack of discussion. How can either of two competing solutions to the same problem be adopted as a standard without such a debate occurring first? What are those of us outside the two groups to think if these two large self-proclaimed “expert groups” cannot achieve a common standard on their own?

As the open-source movement has caught on in recent years, there have been more and more attempts to create semi-proprietary “open” standards groups. Sun’s Java Community Process, which is behind one of the two real-time Java groups, is a prime example. But it’s not at all clear that today’s “open” standards groups are anything more than the old proprietary standards, disguised.

NOTE: this article originally published on 12/6/00.