Technology

Why Engineers Love the Smart City Works Actuator

So now it’s real! A fantastic ribbon-cutting and Meet the Cohort event last Friday the 14th launched the new Smart City Works Actuator at CIT, next door to our enormously successful and now four-year-old cybersecurity accelerator, MACH37 (which also graciously hosted the event). The Governor came to get the 100 or so guests pumped up and glad to be Virginians. Thomas Smith, the Executive Director of the American Society of Civil Engineers, spoke about our failing infrastructure and how the Smart City Actuator could play a role in helping renew it. There actually was a ribbon, and the Governor was decisive in cutting it (look at the lever arms on those scissors!). And, in addition to civil engineers, we had electrical, mechanical, transportation, and aerospace engineers, computer scientists and data scientists, a materials scientist or two (graphene, of course), and probably more. So why do all sorts of engineers love the Smart City Works Actuator? We can turn to the Laws of Physics for answers. Two laws that every engineer learns apply here:

F=ma, where a of course is acceleration,

and the formula for Kinetic energy (energy in action)

KE = ½mv²

Now for our purposes we will let m represent the size of the mentor network, and v represent the volume of innovative companies the accelerator can handle. By starting the Smart City Works Actuator, a has now become 2a, m has become 2m, and v is of course 2v. Substituting into our equations, and letting F represent the amount of fun we are having, any engineer can tell you the results:

(2m)(2a) = 4F …four times the fun!

and

½(2m)(2v)² = 4KE …four times the energy!!
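Spelling out the substitution (a small LaTeX aside, writing KE for the kinetic energy and primes for the post-Actuator quantities):

```latex
\[
\begin{aligned}
F'  &= (2m)(2a) = 4\,ma = 4F \\
KE' &= \tfrac{1}{2}\,(2m)(2v)^2 = 4\left(\tfrac{1}{2}\,m v^2\right) = 4\,KE
\end{aligned}
\]
```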

Yes, it’s true. Engineers love the Smart City Works Actuator because, together with our MACH37 Accelerator, they can come and have four times the fun and experience four times the energy, all while helping build a better world. Q.E.D.

Of course the way we help Actuate a better world is by helping accelerate our innovative entrepreneurs, and the Smart City Works Actuator has some great ones!

IHT.  You no longer need to be a scientist to know whether your water is safe. Using a patented new technology, Integrated Health Technologies’ Sensor Bottle™ detects and relays water quality information to your phone to provide you with real-time peace of mind that the water you consume is safe to drink.  For cities, these bottles provide a crowd-sourced platform for real-time water quality detection and monitoring of municipal water systems.

UNOMICEDGE.  UNOMICEDGE is a Software Defined Network solution for securely connecting the Cloud to devices at the Network Edge.  It includes a network Hypervisor that not only enforces network security policies, but also develops critical business and operational insights from user and device interactions. Smart cities rely on smart IoT devices at the Network Edge.  UnomicEdge not only reduces the cyber risk of IoT, but can provide valuable intelligence to make businesses and cities run smarter.

Infraccess.  Infraccess is powering up infrastructure investment by providing easier access to trusted data so you can more efficiently discover investment opportunities, make quicker, better-informed investments, and reduce overall investment risk. The Infraccess web-based workflow platform sources and transforms unstructured information into smart data and proprietary performance indicators to help unlock billions in investment opportunities in infrastructure.

Capital Construction Solutions.  Capital Construction Solutions creates mobile-based risk management platforms for improving enterprise-wide accountability and transparency.  With Capital Construction Solutions deployed in the field, companies can immediately turn day-to-day operations into opportunities to reduce corporate liability, mitigate risk, and significantly increase profits.

PLANITIMPACT.  Design decisions can have significant and long-lasting impact on business and environmental costs.  PlanITimpact has created a smart modeling platform to help building professionals better understand and improve performance, including energy, water use, stormwater and transportation, so owners, investors, and communities can better visualize project impacts and returns on investment.

GREATER PLACES.  Cities worldwide are investing in the next generation of buildings, infrastructure, transportation, and technology. But where can you turn to readily find the best leading-edge solutions in this space?  GreaterPlaces creates a single web-based and mobile platform for bringing together the best ideas, inspirations, and practices for designing and governing cities—a marketplace and tools to connect people seeking ideas, products and services to transform cities worldwide.

Come join them and see what you’re missing!
All photos courtesy of Dan Woolley
How Do We Know the Actuator is Working? Part 2 – Government Programs

Follow us @CITOrg or @dihrie or this blog for current information on the new Smart City Actuator.

In the last post we looked at commercial accelerator/investment programs and presented a methodology and results allowing more or less direct comparison of outcomes for these programs. The original study, however, was funded by the Federal Government to look at how these commercial approaches compare to government innovation programs.

There are a number of well-known innovation activities across the Federal Government, including the Small Business Administration’s SBIR program (Small Business Innovation Research…check out CIT’s own Robert Brooke as an SBIR 2016 Tibbetts Award Winner!), the Defense Advanced Research Projects Agency (DARPA) and its programs, In-Q-Tel, which is the intelligence community program that uses commercial-style practices to encourage government-relevant innovation, and others in the various military services, national laboratories, contractor activities and elsewhere.

Access to data on success rates and outcomes is as fragmented in the government domain as it is in the commercial world, in some cases compounded by classified work or other policies specifically designed to limit information access. But as in the commercial world, the best-known and most successful Government programs are not shy about sharing their results. Special commendation goes to the SBIR program and In-Q-Tel, both of whose data sets have proven invaluable.

The SBIR program identifies three phases of development, with Phase I generally less than 6 months and $150,000, Phase II generally less than $1,000,000 and 2 years, and Phase III externally (non-SBIR) funded, providing a reasonable basis of comparison in our model. The SBIR program publishes good data on the percentage of proposals and the amounts awarded at each Phase, allowing for a robust analysis, although the Government Accountability Office (GAO) did find that the data are insufficient to determine DoD SBIR transition success from Phase II to Phase III. One additional twist is that the Navy instituted a program from 2000 to 2015 called the Transition Assistance Program (TAP) to provide mentoring support to these early stage researchers, and that data is also available in at least one study looking at the period from 2005 to 2008.

DARPA was a bit of a surprise. When the GAO tried to assess the transition performance of DARPA projects, they concluded that “inconsistencies in how the agency defines and assesses its transition outcomes preclude GAO from reliably reporting on transition performance across DARPA’s portfolio of 150 programs that were successfully completed between fiscal years 2010 and 2014.” Another study puts DARPA’s success rate at about 3-5 products to market per year over 40 years, which the authors characterize as “quite impressive.” Measuring transition success is clearly not a priority.

In-Q-Tel data were a bit harder to come by, but here we were able to use two sources: their published number on downstream funding leverage, and a calculated number based on external data about the size of the In-Q-Tel portfolio and additional published funding events. Thus we were able to calculate a performance number and compare it to the published number, again as a check on the validity of the model. All of these results are shown in the Figure. The In-Q-Tel (IQT) data show reasonable correlation between published and calculated numbers depending on where IQT falls on the investment spectrum, and also show that the best Government programs perform in line with the best of the commercial programs.
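To make the cross-check concrete, here is a minimal Python sketch of the arithmetic. Every number below is a placeholder invented purely for illustration, not an actual In-Q-Tel or study figure; the point is only the shape of the calculation: divide observed downstream funding by the estimated dollars invested across the portfolio, then compare that calculated leverage to the published leverage.

```python
# Minimal sketch of the leverage cross-check described above.
# Every number below is a made-up placeholder, NOT real In-Q-Tel data.

published_leverage = 10.0            # hypothetical published figure: follow-on $ per invested $
portfolio_size = 100                 # hypothetical number of portfolio companies (external data)
avg_investment = 2_000_000           # hypothetical average investment per company, in dollars
downstream_funding = 2_500_000_000   # hypothetical total follow-on funding from published events

# Calculated leverage: observed downstream funding per estimated dollar invested.
calculated_leverage = downstream_funding / (portfolio_size * avg_investment)

print(f"Published leverage:  {published_leverage:.1f}x")
print(f"Calculated leverage: {calculated_leverage:.1f}x")
```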

What about the rest? A couple of things seem clear. First, there is much more emphasis on activity than on outcomes in the Government R&D space…how many programs are funded versus how many of those funded programs succeed in eventually deploying to users. Given the rapid rate of change in technology and the fact that our national strategic competitors are looking very hard for strategic advantage, it is certainly in the U.S. national interest to have a robust scientific community actively researching a large number of areas of interest. In this domain, activity rather than outcomes may in fact be the right metric. Some of the focus on activity is also driven by the Government budget cycle process, and certainly if outcomes are not reliably known for 4-7 years, as in the commercial world, that is beyond the next election cycle for most elected officials.

But in that subset where transition to users is important, perhaps even a stated goal, the Government programs seem to struggle. The fact that GAO could not determine transition success rates for either SBIR Phase III or DARPA is one indicator. Plenty of literature speaks of the “Valley of Death” in the Government world, where inventions go to die before ever being deployed.

Among other issues, there are structural reasons for this. The “market” for Government transition is generally Programs of Record, those big, often billion-dollar programs. Those programs run on an entirely different set of principles than the Government R&D world, a set of principles where risk reduction rules the day and innovation may not even be welcome. So most Government R&D programs and national laboratories now have “technology transition” programs or offices, looking to commercialize all those great inventions that have been developed along the way, in some cases with associated patents.

The standard model for these efforts has been to look at the outcomes of the early-stage R&D process and license the intellectual property, or try to find an entrepreneur who will take it on, or encourage the inventor to become an entrepreneur. Two problems plague this approach: intellectual property transfers much more effectively via people than via paper; and inventions created and prototyped without the market discipline of external investors and users determining value are almost always poorly optimized for the eventual market they hope to serve.

The programs that have done best at this are those that adopt the most “commercial-like” practices: listen to your customers and end users, get early feedback, understand the needs, understand the price points, worry about transition to market. When GAO looked at a set of DARPA case studies, they summarized it this way. [Figure: DARPA Factors for Success]

The good news is that the Smart Cities Actuator instills the commercial version of exactly this set of principles. While the Federal government can focus significant investment on technology development, it seems that the best Government programs are about as good as the best commercial programs. The difference is not the amount of money but the set of practices that make transition effective.

Next (Monday 3/20): How do we know the Actuator is Working? Part 3 – Corporate/University Programs

What Is a Smart City?

Follow us @CITOrg or @dihrie or this blog for current information on the new Smart City Actuator.

As CIT and Smart City Works developed our new Smart City Works Actuator, this question kept coming up from just about everyone. Some people just asked. Others knew a few reference points: I know so and so city is doing smart parking meters or smart street lights or smart trash collection…is that what you mean? Still others referenced the technology components: do you mean broadband, or Internet of Things (IoT), or cybersecurity, or autonomous vehicles? A few asked the meta questions: will this improve resilience, will this enable the surveillance state, will this improve people’s lives?

The standard web definitions were not much help. Wikipedia has: “A smart city is an urban development vision to integrate multiple information and communication technology (ICT) and Internet of Things (IoT) solutions in a secure fashion to manage a city’s assets – the city’s assets include, but are not limited to, local departments’ information systems, schools, libraries, transportation systems, hospitals, power plants, water supply networks, waste management, law enforcement, and other community services. The goal of building a smart city is to improve quality of life by using urban informatics and technology to improve the efficiency of services and meet residents’ needs.” This is helpful, and brings Quality of Life to the table, but does not provide much guidance on how to build one, or how these many and varied pieces fit together.

Sarwant Singh, based on a Frost & Sullivan study, provides a fairly typical definition, “We identified eight key aspects that define a Smart City: smart governance, smart energy, smart building, smart mobility, smart infrastructure, smart technology, smart healthcare and smart citizen.” Lots of smarts and interdependencies, but not much structure.

So we developed our own definition, one based loosely on the old communications stack model, where each layer of the stack depends on services provided by the layer below it. Note in this version we have explicitly included the 22 City Link™ platform at the Link layer, since an initial implementation of this vision will be with our partners at Gramercy District where the 22 City Link™ platform is being piloted; other communities may have different Link layer implementations.
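As a rough illustration of the stack idea, here is a toy sketch in Python. Only the Link layer and the 22 City Link™ platform are named in the post; the other layer names below are assumptions added purely to show how each layer rests on services from the layer beneath it, not the actual CIT/Smart City Works model.

```python
# Toy sketch of a layered smart-city stack. Layer names other than "Link"
# are illustrative assumptions, not the published model.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Layer:
    name: str
    provides: str
    below: Optional["Layer"] = None  # the layer this one depends on

link = Layer("Link", "connectivity platform (e.g., 22 City Link)")
infrastructure = Layer("Infrastructure", "transportation, energy, water, buildings", below=link)
services = Layer("Services & data", "IoT, analytics, cybersecurity", below=infrastructure)
quality_of_life = Layer("Quality of life", "people-centric use cases", below=services)

# Walk down the stack from the people-centric top to the Link layer.
layer = quality_of_life
while layer is not None:
    depends = layer.below.name if layer.below else "nothing (foundation layer)"
    print(f"{layer.name}: provides {layer.provides}; depends on {depends}")
    layer = layer.below
```

Running it simply prints the chain of dependencies from quality-of-life use cases down to the Link layer, which is the whole point of the stack view.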

Several things stand out in this working definition:

  1. It explicitly ties technologies up the stack to the people-centric use cases around improving quality of life.
  2. It provides context for things such as data collection or cybersecurity or autonomous vehicles…we don’t want to do them just because we can, but because they achieve some goal in the context of a set of infrastructures. This context also helps open up questions along the lines of: what is the proper balance between the privacy of people transiting the urban environment, and data collection for use by retailers…who owns the data, what permissions are needed, can it be re-sold, how long can it be retained, etc.
  3. For the Actuator, it will help us help innovators understand where they fit in a larger picture, which will aid them in defining the boundaries of what needs to be included in their specific product offerings. Furthermore, this provides fodder for discussions of how to scale a product. It is fantastic to be able to demonstrate a product in the friendly, custom confines of Gramercy District, but proving that a product can scale, and is thus investable, requires that it function in a wide range of built environments and infrastructure, old and new.
  4. It can be useful in early “diagnostic” discussions…for developers the discussion includes topics like “what do the buildings look like?” For communities whose vision is something like “become a smart city” but that are underserved when it comes to connectivity, it provides a starting point for a longer-term strategic growth plan that begins with “first, get connected”. For larger platform companies it may help make the externalities explicit for ongoing product evolution and understanding the sweet spots and limitations of existing products.

Our collective understanding of Smart Cities is evolving rapidly as the innovation process and ecosystems begin to explode. Hopefully this working definition will provide a more stable framework for understanding where and how these innovations can ultimately serve to improve our quality of life.

Next (Monday 2/27): What Is an Actuator?

CSAM Back to Basics: Computers

Modern computers were invented around World War II. According to Walter Isaacson’s book The Innovators, these machines have four main characteristics. They are:

  • Digital – using discrete numbers rather than continuous parameters such as voltage levels
  • Binary – the particular underlying number scheme is based only on zeros and ones, not decimal numbers or some other choice. Turns out this matches well with electronic methods of computation.
  • Electronic – Well of course. But, at the time, many computing machines used electromechanical relays or combinations of different types of devices. The invention of the transistor settled that one.
  • General Purpose – Alan Turing provided the theoretical background here. Yes, this is the same genius portrayed as breaking the German enigma codes in the movie The Imitation Game, and also the inventor of the Turing test for determining whether or not a machine exhibits intelligence, as portrayed in the movie Ex Machina and others. But the theoretical Turing Machine is perhaps his most enduring legacy (movies aside). Earlier work had shown that any system of logic has problems that it cannot solve, and that there is no way to determine a priori which ones are unsolvable. Turing was able to prove that his conceptual machine could solve all the rest, every problem that was computable. The machines we use today are logically equivalent to his Turing machine, and hence “general purpose” in a very strong sense of the term.

One other key figure in the early development of computers was John von Neumann. Among his many contributions was the realization that both data and computer programs could be represented as binary strings and thus could be stored in the same memory. Again this was not obvious at the time, but today’s machines are almost universally built on the “von Neumann architecture” of shared storage.

These basics have remained the foundation of computers for decades. Of course there have been dramatic changes, and not only as a result of Moore’s Law, describing the exponential increase in computational power over time. First, computer scientists figured out how to get single computers to run multiple programs “simultaneously”, making them more efficient and also allowing multiple users on machines. The von Neumann concept of a single storage component became increasingly specialized as memory chips evolved, with hierarchies of faster, more expensive storage near the computational engines and slower, cheaper memory further away. More recently we have seen the change from a single computing core to parallel processing with multiple cores (most new machines have 4 or more) along with specialized graphics engines and other auxiliary computational components. Each change in technology has also led to changes in the operating systems, the specialized software that controls how the basic computer functions are carried out.
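As a quick aside, the shift to multiple cores is visible even from user-level code. A minimal Python sketch (assuming nothing beyond the standard library) that reports the core count and spreads trivial work across the cores:

```python
# Minimal sketch: report the core count and farm work out across cores.
from multiprocessing import Pool, cpu_count

def square(n: int) -> int:
    return n * n

if __name__ == "__main__":
    print(f"This machine reports {cpu_count()} logical cores")
    with Pool() as pool:                    # defaults to one worker process per core
        print(pool.map(square, range(8)))   # [0, 1, 4, 9, 16, 25, 36, 49]
```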

Interesting history, but why is this a topic for cybersecurity awareness (apart from the obvious fact that hackers want to go where the data lives)? Basically, each increase in complexity is another opportunity, another potential vulnerability for someone to exploit.

Take the von Neumann architecture. Computers keep track of what instruction to execute next, and where to find the data for that instruction through the use of pointers. They let the computer know to go to address “abcd” for the next instruction and address “wxyz” for the next piece of data. But if someone finds a way to load a set of nefarious instructions that look like data and then change the pointer, the computer doesn’t know the difference…you’re pwned. Or, since we have multiple users on a single machine, if someone gains access as a normal user and then sets their privileges so the machine believes they are an administrator…poof, they’re an administrator. Even newer memory types, such as the Non-Volatile RAM that retains data when the power goes off, open up a whole new range of in-memory persistent hacks.
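A toy Python sketch (purely illustrative, not a real exploit) of the pointer problem just described: in a machine where instructions and data share the same memory, anything that can overwrite memory and redirect the instruction pointer can turn “data” into running code. The addresses "abcd" and "wxyz" echo the example above; "evil" is an invented label.

```python
# Toy model of shared code/data memory plus a redirectable instruction pointer.
memory = {
    "abcd": lambda: print("executing the intended instruction"),  # code
    "wxyz": 42,                                                   # ordinary data
}
next_instruction = "abcd"   # the pointer the machine trusts

# Normal operation: fetch and execute whatever the pointer names.
memory[next_instruction]()

# An attacker who can write to memory and change the pointer wins:
memory["evil"] = lambda: print("pwned: attacker-supplied 'data' just ran as code")
next_instruction = "evil"
memory[next_instruction]()  # the machine cannot tell the difference
```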

This is a recurring theme in cybersecurity (and elsewhere, for that matter). The systems we use have become so complex that they are beyond the capabilities of any individual to fully understand, and provide abundant opportunities for failure or exploitation. Even at the level of individual computers, the digital, binary, electronic string of events that takes place in the millions of “transistors” that make up the general purpose computational engine in today’s computers is hard to imagine…whether it is sending an e-mail, playing a game, or presenting your next great concept.

What can you do as a user, faced with this complexity? Four main things:

  1. Buy from reputable sources. The major choices for laptops are fairly limited, but there are many more suppliers of peripherals, and any of these devices can provide potential attack vectors.
  2. Make sure your machine is properly configured. This is good advice both for new machines and as a periodic check on your existing machines. Microsoft, for example, created significant controversy when it shipped new Windows 10 machines this past summer configured by default with settings and apps that some users considered violations of their privacy. As a user you can generally check configuration settings both in the operating system and in your browser and other basic programs.
  3. Keep it updated. Maintaining software to current patch levels is becoming easier, but is still a time-consuming nuisance. Nevertheless, this is the primary method that software manufacturers use to correct recently discovered errors, and unpatched systems continue to be a primary exploitable vulnerability.
  4. Periodically clean up your machine. Delete unused apps. Defragment your drives. Backup your data. Update and run your anti-malware programs. This type of maintenance will make your machine run better, and also reduce your vulnerability to a variety of attacks and unpleasantries.

Happy computing!

CTO SmackChat: Technology is not Innovation

First posted 01/08/14 on MACH37.com

In his excellent book “The Idea Factory: Bell Labs and the Great Age of American Innovation”, Jon Gertner quotes Jack Morton, who worked at the Labs on the development of the transistor in the 1940s, saying “[Innovation] is not just the discovery of new phenomena, nor the development of a new product or manufacturing technique, nor the creation of a new market”, but all of these working together to deliver things that make a difference. Or, as one of our investors puts it succinctly: “a business without customers is just a hobby”.

As technologists, we of the nerdly persuasion tend to believe that the tech is the key ingredient in the success of any startup. At MACH37 we talk to a lot of incredibly smart technical people, some with potentially game-changing ideas…but technology is not innovation. For a startup to deliver products that make a difference it takes a great technical idea, but also someone who knows how to build a business, someone who knows how to turn an idea into a product, and people who can find customers, understand their problems and sell them your idea. Innovation is a team sport.

So, how important is the tech? As we evaluate startups and talk to investors, a large majority consider it essential to have someone with deep technical domain expertise, as well as product development skills, as part of the initial entrepreneurial team. Many of those same people will tell you, however, that the initial technology contributes maybe only 10% or 20% to the success of the business, that the ability to pivot is critical, and that technology almost never creates new market segments. My own rule of thumb is that your going-in idea is always wrong.

Making sense of the contradictions can be maddening…being passionate about your ideas but willing to turn on a dime; knowing what is necessary but not sufficient; being game-changing in a way that’s not too ground-breaking. This is the first of a series of posts exploring these contradictions from the technologist’s point of view. How many features make a product? When do you abandon Rev 1 and start over? When does one product become two? How do you know what customers really want? How far ahead of the market or the product can you be? And once you delegate product design, customer interaction, and hands-on coding, how do you continue to add value to your organization?

David Ihrie is CTO of MACH37 and has been the lead technical person for six startup companies. He has a BS in EE/CS and an MS in Management specializing in the Management of Technological Innovation, both from MIT.