In recent years, it has become apparent that software is the driving technology in the computer industry. The evolution of the UNIX and Windows platforms shows that applications and markets are defined by software rather than by the underlying hardware, as had been the case with earlier platforms such as the mainframe, the VAX, and the AS/400. Moreover, our interviews indicate that as much as 70% of the cost of developing a new hardware system is actually spent on software development. Our research also confirms that 80% of the lifetime cost of purchasing and upgrading a computer system goes to software purchase, licensing, development (in-house or outsourced), and maintenance. Even microprocessor design, once a CAD-based activity, is now largely a coding and debugging process carried out in a special-purpose hardware description language.

As processor and peripheral manufacturing move to the Third World, we recognize that the future of the US computer industry lies in design, networking, telecommunications, and interactive content, all built on complex software technologies. On the downside, we also see that major computer industry problems such as interoperability, system errors, and project failures are in large part software problems. Thus, our examination of the computer industry must include careful measurement and analysis of the software industry.

Through interviews, student research, and small-scale surveys, we are well along in identifying the key business, public policy, and educational issues that have shaped the software industry. As we continue to investigate how these issues will influence the future development of the industry, we have also established an extensive, multi-faceted dialogue with software industry leaders and other interested groups in the US and abroad, both to augment our research activities and to disseminate our findings as effectively as possible.