Event Recap: DC Energy and Data Summit

This is a guest post by Majid al-Dosari, a master’s student in Computational Science at George Mason University. I recently attended the first DC Energy and Data Summit organized by Potential Energy DC and co-hosted by the American Association for the Advancement of Science’s Fellowship Big Data Affinity Group. I was excited to be at a conference where two important issues of modern society meet: energy and (big) data!

There was a keynote and plenary panel. In addition, there were three breakout sessions where participants brainstormed improvements to building energy efficiency, the grid, and transportation. Many of the issues raised at the conference could be either big data or energy issues (separately). However, I’m only going to highlight points raised that deal with both energy and data.

In the keynote, Joel Gurin (NYU Governance Lab, Director of OpenData500) emphasized the benefits of open government data (which can include unexpected use cases). In the energy field, this includes data about electric power consumption, solar irradiance, and public transport. He mentioned that the private sector also has a role in publishing and adding value to existing data.

Then, in the plenary panel, Lucy Nowell (Department of Energy) brought up the costs associated with the management, transport, and analysis of big data. These costs can be measured in terms of time and energy, which prompts a question: at what point does it “cost” less to transport some amount of data physically (via a SneakerNet) than it does over a computer network?
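The breakeven in that question can be estimated with back-of-envelope arithmetic. The numbers below (a 1 Gbps link, a flat 24-hour courier time) are purely illustrative assumptions, not figures from the talk:

```python
# Back-of-envelope comparison: when does shipping drives beat the network?

def network_hours(data_tb: float, link_gbps: float) -> float:
    """Hours to move data_tb terabytes over a link_gbps network link."""
    bits = data_tb * 1e12 * 8            # terabytes -> bits
    seconds = bits / (link_gbps * 1e9)   # bits / (bits per second)
    return seconds / 3600

SHIPPING_HOURS = 24.0  # assumed overnight courier, independent of volume

for tb in (1, 10, 100):
    net = network_hours(tb, link_gbps=1.0)  # assume a dedicated 1 Gbps link
    winner = "network" if net < SHIPPING_HOURS else "sneakernet"
    print(f"{tb:>4} TB: network {net:7.1f} h vs shipping {SHIPPING_HOURS} h -> {winner}")
```

Under these assumptions, the crossover sits near 10 TB: anything much larger is faster in a box, and that is before counting the energy cost of keeping the link busy.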

After the panel, I attended the breakout session dealing with energy efficiency of homes and businesses. The former is the domain of Opower represented by Asher Burns-Burg, while the latter is the domain of Aquicore represented by Logan Soya. It is of interest to compare the general strategy of both companies here. Opower uses psychological methods to encourage households to reduce consumption. On the other hand, Aquicore uses business metrics to show how building managers can save money. But both are data-enabled.

Asher claims that Opower is just scratching the surface with what is possible with the use of data. He also talked about how personalization can be used to deliver more effective messages to consumers. Meanwhile, Aquicore has challenges associated with working with existing (old) metering technology in order to obtain more fine-grained data on building energy use.

In the concluding remarks, I became aware of discussions at the other breakout sessions. The most notable to me was a concern raised by the transportation session: The rebound effect can offset any gain in efficiency by an increase in consumption. Also, the grid breakout session suggested that there should be a centralized “data mart” and a way to be able to easily navigate the regulations of the energy industry.

While DC is not Houston, the unique environment of policy, entrepreneurship, and analytical talent gives DC the potential to innovate in this area. Credit goes to Potential Energy DC for creating a supportive environment.

Political Tech: Predicting 2016 Headlines

Mark Stephenson is a Founding Partner at Cardinal Insights, a data analysis, modeling and strategy firm.  Cardinal Insights provides accessible and powerful data targeting tools to Republican campaigns and causes of all sizes.  Twitter:  @markjstephenson

The reliance on data in politics comes as no surprise to those who watch trends in technology.  Business and corporate entities have been making major investments in data analysis, warehousing and processing for decades, as have both major political parties.  As the strategic, tactical and demographic winds have shifted for political operatives, so too has the need to build high-quality datasets backed by robust analysis efforts.

Recent efforts by both Republican and Democrat organizations to outpace each other in the analytical race to the top have been well documented by the press[1].  With the 2016 Presidential election cycle already underway (yes...really), I decided to make some headline predictions for what we will see after our next President is elected, as it relates to data, technology and organizational shifts over the next three years.


"Data Crunchers Analyze Their Way Into the White House"


Echoing the headlines we saw in 2012, the reliance on and seniority of data science staff will continue to grow.  Senior members of both parties' committee and presidential campaign staffs will be technologists (this is already happening), and data science will be integrated into all aspects of those campaigns (i.e., fundraising, political, digital, etc.).


"Digital Targeting Dominates the Targeting Playbook"


Studies continue to show shifts in how voters consume advertising content, including political messaging.  Television still remains a core tactical tool, but voters of all ages are increasingly unplugged from traditional methods[2].  One-to-one data and digital targeting will grow in the scope and budget it receives and both vendors and campaigns will shift tactical applications to respond to demands.


"Scaled Data: State Races Take Advantage of National Tools"


In 2016, not only will national, big budget races use data and analytics to glean insights, but these tools will scale to lower level, state-based campaigns.  Along with more widely available, cheap (even free) technology, companies like Cardinal Insights and efforts like "Project Ivy"[3] are turning what used to be expensive and time-consuming data analysis into scalable, accessible products.  These will have lasting effects on the profile of many state House and Senate legislatures and, as a result, on state and local political outcomes.


"Business Takes Notice: Political Data Wizards Shift Corporate Efforts"


Just as many political operatives took skills learned and applied during the 2012 election and focused them on entrepreneurship, the same will happen to a higher degree after 2016.  Innovators in the political data, digital and television spaces will prove the effectiveness of these new tools and as a result, corporate marketing and advertising will seek them out.


"Shift from 'The Gut' to 'The Numbers' for Decision Making and Targeting"


Many decisions made by political operatives in the past were made from the gut:  their intuition told them that a certain choice was the right one, not necessarily a proven method, backed by data.  In 2016, there will continue to be a dynamic shift towards data-driven efforts throughout campaigns, with an emphasis on testing, metrics and fact-based decision making.  This will permeate all divisions of campaigns, from fundraising to operations to political decisions.

Just as companies like Amazon, Coca-Cola and Ford build massive data and analysis infrastructures to capitalize on sales opportunities, political campaigns will do the same to capitalize on persuading voters.  As trends in data analysis, targeting, statistical modeling and technology continue to reveal themselves, you will read many headlines in late November 2016 that are similar to the ones above.  Keep an eye on the press to see what campaigns do in 2014, and watch a booming analytical industry continue to spread throughout American politics.





Calling all Coders! Code-a-Palooza Submissions Now Open

The Health Datapalooza 2014 Code-a-Palooza challenge is now open for submissions! Teams will use newly-released Centers for Medicare and Medicaid Services (CMS) data to create interactive data visualization tools to help consumers improve their health care decision-making.  Prizes totaling $35,000 will be awarded.

Code-a-Palooza Timeline:

  • Wednesday, April 9 – Code-a-Palooza opens for submissions
  • Friday, April 25 – Visualization proposals of no more than 750 words due
  • Friday, May 2 – Top five to ten finalists notified
  • Throughout May – Finalists build out visualization tools
  • June 1-3 at Health Datapalooza – Finalists present a live demo to a panel of judges and winners are announced

Why should you participate?

  • Code-a-Palooza applicants will be eligible for a discounted Health Datapalooza registration rate of $195
  • Finalists receive two complimentary registrations to Health Datapalooza 2014
  • Gain recognition for your team and network with leaders in healthcare
  • First place team awarded $20,000, second place $10,000 and third $5,000


To learn more about this year’s Code-a-Palooza challenge and submit your proposal, please visit the Health Datapalooza website.

Watch a video of last year’s winners, Hippocratic Code from Medstar Institute for Innovation, to learn more about the Health Datapalooza Code-a-Palooza experience.

Selling Data Science: Validation

We are all familiar with the phrase "You cannot see the forest for the trees," and this certainly applies to us as data scientists.  We can become so involved with what we're doing, what we're building, and the details of our work that we don't know what our work looks like to other people.  Often we want others to understand just how hard it was to do what we've done, just how much work went into it, and sometimes we're vain enough to want people to know just how smart we are.

So what do we do?  How do we validate one action over another?  Do we build the trees so others can see the forest?  Must others know the details to validate what we've built, or is it enough that they can make use of our work?

We are all made equal by our limitation to 24 hours in a day, and we must choose what we listen to and what we don't, what we focus on and what we don't.  The people who make use of our work must do the same.  There is an old philosophical thought experiment: "If a tree falls in the woods and no one is around to hear it, does it make a sound?"  If we explain all the details of our work and no one gives the time to listen, will anyone understand?  To what will people give their time?

Let's suppose that we can successfully communicate all the challenges we faced and overcame in building our magnificent ideas (as if anyone would sit still that long).  What then?  Thomas Edison is famous for saying, "I have not failed. I've just found 10,000 ways that won't work."  But today we buy light bulbs that work; who remembers all the details of the different ways he failed?  "It may be important for people who are studying the thermodynamic effects of electrical currents through materials."  Fine; it's important for that person to know the difference, but for the rest of us it's still not important.  We experiment, we fail, we overcome, and thereby validate our work so that others don't have to.

Better to teach a man to fish than to provide for him forever, but there are an infinite number of ways to successfully fish.  Some approaches may be nuanced in their differences, but others may be so wildly different that they're unrecognizable, unbelievable, and invite incredulity.  The catch (no pun intended) is that methods are valid because they yield measurable results.

It's important to catch fish, but success is neither consistent nor guaranteed, so groups of people may fish together and share their bounty so everyone is fed.  What if someone starts using this unrecognizable and unbelievable method of fishing?  Will the others accept this "risk" and share their fish with those who won't use the "right" fishing technique, their technique?  Even if it works the first time, that may simply be a fluke, they say, and we certainly can't waste any more resources "risking" hungry bellies now, can we?

So does validation lie in the method or the results?  If you're going hungry you might try a new technique, or you might have faith in what's worked until the bitter end.  If a few people can catch plenty of fish for the rest, let the others experiment.  Maybe you're better at making boats, so both you and the fishermen prosper.  Perhaps there's someone else willing to share the risk because they see your vision, your combined efforts giving you both a better chance at validation.

If we go along with what others are comfortable with, they'll provide fish.  If we have enough fish for a while, we can experiment and potentially catch more fish in the long run.  Others may see the value in our experiments and provide us fish for a while until we start catching fish.  In the end you need fish, and if others aren't willing to give you fish you have to get your own fish, whatever method yields results.

ConnecTech, the DC Government, Small Business Innovation Research, and Data Community DC

One way that smaller and startup firms can become more specialized is to enhance their service offerings through research and development (R&D). While R&D typically requires internal investment, the federal government has a program in place that will award grants or contracts to small businesses to pursue R&D efforts on its behalf. The Small Business Innovation Research (SBIR) program is a highly competitive program that encourages domestic small businesses to engage in Federal Research/Research and Development (R/R&D) that has the potential for commercialization.

How It Works

The SBIR program awards contracts or grants in three phases to small businesses. Phase I is typically an award of $150,000 for twelve months to establish the technical merit, feasibility, and commercial potential of the proposed R/R&D effort. Phase II is typically an award of $1,000,000 for two years to continue the R/R&D efforts initiated in Phase I, usually toward a refined prototype. Phase III is full commercialization. The SBIR program does not fund Phase III work. However, for some federal agencies, Phase III may involve continuing, non-SBIR funded R&D or production contracts for products, processes or services intended for use by the U.S. Government.


The District of Columbia recently began a new program called ConnecTech that aims to engage more District businesses with the SBIR program through a variety of offerings. Primarily, ConnecTech will provide training to entrepreneurs and companies interested in the SBIR program. The training sessions are focused on topics that will help better position firms for a successful Phase III transition even before submitting the Phase I bid. This includes teaching companies how best to identify topics that are likely to be transitioned to Phase III and selecting the right partners for the R&D effort.

Still Not Convinced?

Ultimately, with groups like ConnecTech offering SBIR support, there is little reason not to participate in the program. In addition to commercialization potential, here are three more reasons why your firm should consider the SBIR program:

  1. Non-Dilutive Capital: For startups and small companies taking on additional capital can mean dilution. SBIR funding is neither equity nor debt, so it is an excellent vehicle for companies to raise capital and further validate their business model.
  2. Reduced Overhead: SBIR efforts require a Principal Investigator (PI) to lead the research. Some firms believe this must be a person with an academic background and a PhD; however, that is not the case. Many times a firm’s Chief Technology Officer or a senior engineer can serve as PI and have some of their hours allocated to the SBIR project instead of another cost center such as overhead.
  3. New Client Relationships and Commercial Work: SBIR projects can represent a way into new clients without the long window that is typically required. Additionally, a requirement of most SBIR bids is a commercialization plan that is focused on the private sector. As a part of developing the plan, the company will be creating a pathway to private sector business that can be expanded and create differentiation away from the federal market.

Topics for Data Community DC

Here is a list of topics currently open from various agencies that may interest Data Community DC blog readers:

The SBIR program leverages the agility and creativity of America’s small businesses and fosters the kind of innovation that business requires. It provides a pathway for entrepreneurial firms to create their next opportunity and cultivates a corporate culture of outside-the-box thinking. With so many resources available to provide SBIR support, more and more companies are beginning to understand the full capabilities the program offers, and yours should as well.

Please feel free to contact Philip Reeves, Manager of Small Business Technology and Innovation at the DC Department of Small and Local Business Development, with questions:



Data Unconference: The Sunlight Foundation's 5th Annual Transparency Camp

DC2 would like to invite you to Sunlight Foundation’s 5th annual TransparencyCamp on May 4th and 5th at the George Washington University’s Marvin Center, Washington, DC. Early bird registration for TransparencyCamp 2013 is still open until March 1, 2013, so register today!

For the last five years, we've gathered together a variety of journalists, policy creators, technologists, concerned citizens, academics, watchdogs, and others to build community, share best practices, and problem-solve challenges in the transparency arena. Last year, we hosted over 400 people from over 30 countries and 26 US states. This year, we’re expecting around 500 participants with even more participation from attendees across the country and abroad. Please check out the TransparencyCamp site for a preview of what to expect at this year’s unconference.

  • What: TransparencyCamp 2013
  • Where: George Washington University (Marvin Center) 800 21st St NW,  Washington, DC 20052
  • When: May 4-5, 2013

Also, lunch is provided by DC’s awesome food trucks. Click here to register for TransparencyCamp now.

Plus check out videos here and here from past TCamps and be sure to come with ideas, share the registration link with your friends, and tweet #TCamp13!