Harnessing Big Data with a Systems Thinking Approach – (A Harley Davidson Case Study)

With 90% of the world’s data created in the last two years, what can we expect our data vaults to hold two or even twenty years from now? Today we measure our lives in petabytes, but by 2020 estimates point to a 2,300% increase in the bits and bytes that will define our lives: 35 zettabytes, to be exact. How then can we as a society leverage the intrinsic value of so much data without getting bogged down by its complexity?

Around the turn of the century, we experienced a similar moment of euphoria when retail outlets opened ‘virtual stores’ and sold products to online buyers. A famous IBM TV ad once depicted an overwhelmed young company whose products went from a few online orders a day to hundreds of thousands. In many respects we have come full circle and are back at the starting gate of yet another era of unprecedented growth, only this time, instead of millions of orders, the focus is on zillions of data points.

In 2000, CEOs focused primarily on IT integration and supply chain strategies to fulfill a surge of orders. Their managers implemented the latest e-commerce packages, leveraged the cloud to reduce costs, broadened and compressed their global supply chains, and trained their workforce to adopt new workflows. Success was measured by a customer’s positive experience, primarily the number of accurate and timely deliveries.

Today, the paradigm has shifted from a transaction-centric one to a customer-centric one. Companies no longer wait for customers to buy; instead they develop sophisticated algorithms that compare a specific customer’s purchase history with multiple data sets, including credit rating reports, recent purchases, and, most extraordinarily, their genuine propensity to buy based upon the web pages they most commonly visit. Surprisingly, web behavioral data has become a powerful complement that can offer unprecedented efficiency benefits to both the merchant and the consumer. Customers receive compelling suggestions, while stores stock the products their customers are most likely to purchase. It’s a win-win for both. Issues of privacy remain a sticking point for some individuals, but as the benefits to the consumer improve, even these issues are expected to become less significant.

Striking the optimal balance will be tricky, especially when the journey also involves slogging through mounds of unstructured web data. One approach being talked up within academic circles is systems thinking.

MIT’s SDM Conference – (sdm.mit.edu)
At a recent System Design and Management (SDM) conference at MIT called “A Systems Approach to Big Data: Going Beyond the Numbers”, Senior Lecturer J. Bradley Morrison greeted a packed audience with a refresher on system dynamics: the study of how the various components within a company (people, materials, contracts, etc.) interact and react together to create a product or service. Morrison’s ‘Back to the Classroom’ exercise offered new insights on how the principles of ‘systems thinking’ that today help companies scale their global operations can also be applied to leverage the new era of big data. His explanation was also a testament to the remarkable versatility of systems thinking and system design management principles.

Morrison divided ‘Systems Thinking’ into several key areas. First was ‘Dynamic Complexity’, which examines how a system reacts when a smooth-running assembly line is unexpectedly interrupted; for example, when a supplier’s product fails and an alternative source is unavailable. According to Morrison, unexpected manufacturing events can also have a direct effect on a company’s morale and effectiveness. The reverse is also true: systems that operate smoothly can greatly improve what Morrison refers to as the ‘Mental Model’.

Another key area is ‘Stocks and Flows’, which Morrison humorously dubbed ‘Bathtub Dynamics’. Just as one balances the water level in a bathtub with running water, systems thinking can help calibrate inflows (i.e., inventory build-up) against outflows (i.e., sales). The depth of the bathtub is determined by a company’s internal competitive advantage. These advantages vary widely, but with regard to the alignment of systems thinking with big data, Morrison focused on skills training as a key differentiator. He illustrated his points with a case study from a US motorcycle manufacturer, Harley Davidson.
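
To make the bathtub metaphor concrete, here is a minimal stock-and-flow sketch in Python. The inflow (production), outflow (sales), and starting inventory figures are hypothetical, chosen only to show how the stock rises and falls as the two flows diverge.

```python
# Minimal stock-and-flow ("bathtub") sketch: the stock at each step equals the
# previous stock plus inflow (e.g., production) minus outflow (e.g., sales).
# All numbers below are hypothetical and serve only to illustrate the dynamics.

def simulate_stock(initial_stock, inflows, outflows):
    """Return the stock level after each period."""
    levels = []
    stock = initial_stock
    for inflow, outflow in zip(inflows, outflows):
        stock = max(stock + inflow - outflow, 0)  # a bathtub cannot drop below empty
        levels.append(stock)
    return levels

if __name__ == "__main__":
    production = [100, 100, 100, 100, 100, 100]   # units built per week (assumed)
    sales      = [ 80,  90, 110, 130, 120, 100]   # units sold per week (assumed)
    print(simulate_stock(initial_stock=50, inflows=production, outflows=sales))
    # A rising stock signals inventory build-up; a falling one signals drawdown.
```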

Harley Davidson Case Study
In the late ’90s, Harley Davidson implemented lean manufacturing systems throughout its operations. Management leveraged its strong union relations to encourage employee input. The response was overwhelming. After numerous meetings, participating employees elected to improve the rotor area on the shop floor. Soon new signs went up, space allocation was optimized, and the new employee-driven initiative became a reality. Management was pleased with the progress. The improvements paid off with an increase in productivity from 70% to 94% without the need for additional floor space. All in all, the project was a success story until a common syndrome called ‘process degradation’ set in.

Like an ambitious diet plan, the initiative reached its goal only to become unsustainable thereafter. Unaddressed issues, such as who was responsible for maintaining the new process, chipped away at the achievements. The collaborative efforts to engage and integrate the surrounding workforce were weak and gave way to a ‘do-it-yourself’, ‘if-and-when-you-can’ approach. Despite the obvious benefits, workers returned to their old habits, inhibiting further progress.

Who was to blame? …management, labor, or both?

Improving productivity with limited resources is a common problem for every company. That is why CEOs leverage technology, timely intel, and training whenever and however possible. Of these three, Morrison points to training as the greatest challenge and the most commonly ignored. Even when training is available, the type he recommends is not classroom-style but on-the-job training.

“Learning a new skill is one thing but learning how to replace one’s old habits with a new skill is quite another,” Morrison explained. “Workers need the opportunity to ‘change their own mental model’ before the true benefits from increased productivity can be fully realized.”

According to Morrison, managers should give their workers the opportunity to learn a new system on their own terms, even if it requires allocating extra time during a shift or work day — as much as 50% more time. Unless workers are given the chance to appreciate the time-saving benefits on a personal level, they will more than likely return to their old habits and simply ‘add on’ the new changes rather than adopt them for their intended benefits.

Looking ahead…
In the next few years, new skills training will involve some form of data analytics integration. As data sources swell in every part of a business, relying on a specialized team to manage the company’s data needs will become unsustainable, especially when experts tell us that big data and data analytics, done right, depend upon the seamless collaboration and exchange of data from every corner of the company. Visionary CEOs will require every employee to learn how to collect, disseminate, compare, and use data from multiple sources. Soon-to-be ‘unsiloed’ departments will depend upon each other in an entirely new manner, since the data each one collects will determine the value and quality of data for the rest of the company.

Just how CEOs balance this data exchange while instilling behavioral changes among their ranks will be a top priority for years to come. …and yet, will CEOs have the foresight to allow their employees to experiment with best practices on company time? As we learned from the Harley Davidson case, leaders who allow their employees to adopt new behaviors on their own terms will more than likely achieve measurable, sustainable advantages. On the other hand, those who follow the herd by, for example, hiring more data scientists to solve their data issues may lose an unprecedented opportunity to transform their workforce. At this juncture, CEOs would do better to implement a systems thinking approach today that allows every employee to eventually become a specialized big data provider and user for the company.

© 2014 Tom Kadala

Improving the Odds of Entrepreneurial Success by Taking a Closer Look at MIT’s Eco-System

If you were sitting at a Las Vegas gambling table with a 3% chance of winning big, would you continue to play or fold? Guessing your likely response, let’s compare this example with launching a startup company. Statistics show that 97% of startups fail by their fifth year of operations, nearly two-thirds of them in their first year. If your response was to fold at the Las Vegas gambling table, then why are so many institutions encouraging students to launch a new company when the data shows that the odds are severely stacked against them?

As though these numbers were not discouraging enough, there are also the private equity firms that search through the rubble of startups in the hope of selecting a winner. Their expectations are even more somber. Of the thousands of business plans reviewed per year, startup investment firms will fund on average four deals, knowing all along that three out of the four companies will either fail or break even after their first year of operations.

So, one might ask, can anything be done to improve the odds of success for a typical startup?

Lab to Market
At universities, the term ‘lab to market’ is used to describe the worn path that many young companies must endure to become successful. Their humble beginnings tell a familiar story in which an unexpected mishap in a lab inadvertently inspired their startup. For some, the inspiration came from a personal experience, as in the case of Dropbox’s founder, Drew Houston, who got tired of using USB drives to move files from one computer to another. Had Drew not been inconvenienced enough times, Dropbox might have evolved differently or not at all. The key to his success was not just his personal revelation and commitment, but also MIT’s established eco-system, which was there when needed to grow his nascent idea into a global company. MIT’s contribution was so crucial that one might ask: if every entrepreneur had access to an eco-system similar to MIT’s, would the odds of success improve? Surprisingly, the answer is ‘not necessarily’.

Ideation
Just as moving ideas from lab to market is challenging, coming up with the ideas in the first place, or ‘ideation’, requires an entirely new approach and discipline, one that MIT addresses today with a first-of-its-kind ‘proof-of-concept’ center known as the Deshpande Center for Technological Innovation – http://deshpande.mit.edu/.

Recently, I attended an awards reception to honor the 2013 winning teams, which were approved for nearly $1m in grants. That evening the lobby of the Media Lab (where the event was held at MIT) was buzzing with sponsors, investors, students, faculty, and other interested parties. Hoping to be discovered, the teams were on hand to display their progress and answer questions. Unlike a traditional startup competition that selects the best business plans, this event focused on the teams with the best business ideas. Appreciating the difference between ideas and plans is key. Ideation occurs at the very beginning of the entrepreneurial process, while business plans that build upon proven ideas come later.

When Drew Houston stumbled upon his vision, Dropbox was just an idea, one that could easily have slipped out of his mind had it not been for a timely injection of funds to nudge him along and help him prove his concept further. That nudge, that tap, that light push made all the difference. The urgency of nurturing ideation at this very initial point in the entrepreneurial process is what inspired Gururaj “Desh” Deshpande and his wife, Jaishree, to donate $17.5 million to launch the Deshpande Center at MIT.

An Innovative Approach
At the reception I caught up with the founder, Desh, and asked him if he was pleased with the Center’s 10-year record of 110 funded projects and 28 successfully spun-out companies. A successful entrepreneur himself, Desh seemed less interested in speaking about his Center’s extraordinary achievements than in the impact the Center has had on faculty and graduate students at MIT. To him, the true value proposition of the Deshpande Center was less about granting awards to a select few and more about the number of applicants who applied. He felt that the Center’s application process forced researchers to view their work from an ‘idea to impact’ perspective, an approach that, he felt, was uncommon among researchers. With his contagious smile, Desh boasted that it was not unusual for non-winning applicants to apply a second or third time. Last year, two such teams, despite not winning a grant from the Center, succeeded in launching their startups anyway. With a deep sense of pride, Desh relished the fact that his Center’s influence had had an equally positive impact on every applicant, regardless of whether they won a grant. Through his Center, Desh had created an ‘ideation culture’, one that is often ignored and yet critically important to the success of any startup eco-system.

Surely the odds of entrepreneurial success should improve if more startups had access to established eco-systems, especially those that support ideation early on. But perhaps the lesson to be learned from the story of MIT’s Deshpande Center is less about funding ideation grants and more about giving entrepreneurs a second or even a third chance to prove their concept. Just think how many fantastic ideas are tossed aside and lost forever simply because a business or grant contest is designed to select only three winners. …or the thousands of business plans tossed in the garbage of an overwhelmed angel investor? …or the business plans that are rejected because of an entrepreneur’s poor presentation skills? Imagine what would happen if one-quarter of the startups presented to a private equity investor were randomly awarded a Deshpande Center-like financial nudge for further proof of concept. Maybe then the odds of succeeding as an entrepreneur would truly improve.

© 2013 Tom Kadala

Disrupting the Banking Industry with Big Data and Data Analytics

Bankers seen sipping away the hours over client martini lunches at upscale restaurants and posh clubs are rare these days. The slump in credit demand from the global economic crisis is partly to blame, but so too is the absence of ‘live’ clients. Branch offices that were once community hangouts on payday now look more like empty office spaces for lease. Today bank clients ‘hang out’ virtually, doing most of their banking online. They lurk in and out of web-based services, unwittingly leaving behind hundreds of data points (like footprints) that, when reconstructed using data analytics algorithms, can accurately reveal the client’s real identity.

At a first-of-its-kind event in Atlanta, Georgia, titled Customer Insights & Analytics in Banking Summit 2013, representatives from various forward-thinking banks and data analytics service companies presented their combined views to a packed room of financial professionals. Organized by Data-Driven Business (datadrivenbiz.com), a US arm of FC Business Intelligence (a London-based events company), the Summit personified the past, present, and future of banking. First, it exposed the ugly truths of a complacent banking culture. Then it highlighted the extraordinary accomplishments of early-adopter banks, and, finally, it unveiled a fantastic prediction of how banking could potentially hold the keys to unlocking the value of social media feeds from Twitter, Facebook, and other similar web-based services.

With off-the-shelf data analytics software tools, bankers can gain an accurate 360-degree view of their customers on an individual basis just by matching a customer’s banking data (i.e., loans, credit card purchases, investments) with their behavioral patterns online. The technology used to integrate data sets to match behaviors with individual names has advanced remarkably, so much so that bankers can calculate with reasonable accuracy the ‘lifetime value’ of each customer. This magical step has been demystified by over 150 vendors who specialize in the science of Digital Data Integration, or DDI, which connects numerous disparate data sets, both structured and unstructured, using assigned ID numbers. Expert companies in this area include Aster (asterdata.com, a Teradata company), Actian (actian.com), PrecisionDemand (precisiondemand.com), Convergence Consulting Group (convergenceconsultinggroup.com), and Actuate (actuate.com, a BIRT company). The principal reason bankers want to segment their customers by their future income potential is to allocate their limited resources more efficiently.
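
To picture the kind of join DDI vendors describe, here is a hedged illustration in Python (using pandas): two hypothetical data sets are matched on a shared customer ID and rolled into a crude lifetime-value estimate. The field names, figures, and scoring formula are assumptions of this sketch, not any vendor’s actual method.

```python
import pandas as pd

# Hypothetical structured banking data, keyed by an assigned customer ID.
banking = pd.DataFrame({
    "customer_id": [101, 102, 103],
    "avg_monthly_spend": [1200.0, 300.0, 800.0],
    "loan_balance": [15000.0, 0.0, 5000.0],
})

# Hypothetical behavioral data derived from web activity, keyed the same way.
behavior = pd.DataFrame({
    "customer_id": [101, 102, 103],
    "monthly_site_visits": [22, 4, 11],
    "propensity_to_buy": [0.8, 0.2, 0.5],   # assumed output of an upstream model
})

# The integration step: connect the disparate data sets on the shared ID.
profile = banking.merge(behavior, on="customer_id")

# A crude, illustrative lifetime-value proxy over a five-year horizon --
# not an industry formula.
profile["lifetime_value"] = (
    profile["avg_monthly_spend"] * 12 * 5 * profile["propensity_to_buy"]
)
print(profile.sort_values("lifetime_value", ascending=False))
```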

Banks that fully integrate their operational data with unstructured social media streams will become the game-changers to watch. Already, Old Florida National Bank boasts a younger, more agile management team (under 43 years of age) who credit the bank’s surging asset growth over the past four years (from US$100m to $1.4b) to its data analytics initiatives. The team has the right bank culture, mindset, and know-how to implement data analytics tools that capture a digitally holistic view of their customers. By mapping where their customers spend most of their time and money, management can target more relevant and timely offerings. Targeted customers respond with not only a buying interest but also a willingness to refer the bank to a friend or colleague. …truly a win-win for all.

SunTrust Bank, headquartered in Atlanta, uses data analytics to determine not only the location of its next branch office, but also the optimal management qualifications required to operate one of its branches. Another interesting case study came from Wells Fargo. Its data analytics team integrates thirty-two data sets (from both internal and external sources) and presents the results in a customized dashboard format to managers company-wide. Managers use the service to make better decisions, present data on an ad-hoc basis at meetings, and self-serve their specific research interests using a number of additional data visualization tools for non-techies. The tools they use are off-the-shelf Business Intelligence, or BI, software packages provided by companies such as Oracle (oracle.com/BI), MicroStrategy (microstrategy.com), and Tableau (tableausoftware.com).

Servicing a more digital client base has come with many challenges as well as unexpected opportunities. For example, credit bureaus that traditionally deny 96% of consumer credit requests often reject qualified candidates. Using data analytics tools, however, banks can integrate comparative behavioral data with a candidate’s payment history and reassess their risk profile accordingly. The result would be the approval of more loans that would otherwise have been turned down. Other exciting ways for banks to grow revenues include working with real estate brokers: a bank can determine which of its clients are most likely to purchase a new home and pass the list on to an agent. Agents seeking better leads will more than likely recommend mortgage business back to the bank that shared the intel.
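
For illustration only, the sketch below shows one way behavioral features might be combined with payment history to re-score an applicant. The synthetic data, the chosen features, and the use of a simple logistic regression (via scikit-learn) are all assumptions of this example, not a description of any bank’s or bureau’s actual process.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Feature columns: on-time payment rate, months of credit history,
# peer-normalized web-behavior score. All values are synthetic.
X_train = np.array([
    [0.95, 60, 0.8],
    [0.60, 12, 0.3],
    [0.85, 36, 0.6],
    [0.40,  6, 0.2],
    [0.90, 48, 0.7],
    [0.55, 10, 0.4],
])
y_train = np.array([1, 0, 1, 0, 1, 0])  # 1 = repaid, 0 = defaulted

model = LogisticRegression().fit(X_train, y_train)

# An applicant a traditional score might reject: thin credit history but
# strong payment behavior and a solid behavioral profile.
applicant = np.array([[0.92, 9, 0.75]])
print("estimated repayment probability:", model.predict_proba(applicant)[0, 1])
```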

One can just imagine how many more ways bank data can play an integral part in helping companies find their most likely customers and future business. Banks already manage transactional data in-house and are rapidly gaining the business intelligence experience needed to integrate their customers’ behavioral data and compare their profiles with those of their peers. Under this scenario, one might wonder why any business would not want to work with a bank that not only understands their business but also delivers buying customers.

With this much real-time intel available on customers in one central location, could banks one day become the primary lead source for their business clients? Could this new normal become a significant game-changer in the banking industry?

Despite a rosy future, the business world is not waiting for banks to embrace data analytics any time soon. Competitive trends point to a number of threats, including retailers such as Walmart that will be offering banking services directly to their customers at retail outlets.

There is also the emergence of the ‘digital wallet’, which for the time being focuses on reducing the clutter of credit cards using available smartphone technology. Eventually one company will umbrella all credit card transactions and offer global behavioral tracking intel. Pioneers on the forefront include Protean Payments (getprotean.com), a recent startup that plans to use Bluetooth technology to replace card swiping at terminals, and Wallaby (https://walla.by), a company that helps cardholders maximize points earned prior to making a purchase. There’s also eBay’s PayPal (paypal.com), which has released a debit card concept that it hopes will entice developers worldwide to promote its data analytics services to SMEs.

In online banking, Simple.com has no physical presence, nor does it charge the customary fees that traditional banks do. In fact, it offers plenty of financial management reports and suggestions at no charge. …all online, of course. How it makes money is best understood when opening an account: a new Simple.com account cannot be opened unless one is willing to accept ‘cookies’ on their computer, a permission that releases a user’s complete web history to a third party. This insistence suggests that Simple.com places greater value on a customer’s online behavioral data than on its banking business.

If Simple.com succeeds, could its new business model significantly change the way consumers perceive a bank’s value proposition? Will consumers demand additional compensation for allowing access to their online behavioral data, since the data is worth more than the interest paid on deposits?

For now, banks that are looking at data analytics for the first time and wondering how and when to take the plunge should heed practical advice from the experts who spoke at the event. One speaker concluded that newcomers to data analytics should start with the data they already have and use predictive findings from data analytics tools to start a conversation rather than to formulate targeted recommendations. This advice, and the rapidly evolving changes in both consumer and commercial banking, remind me of the famous Aesop’s fable about the race between the tortoise and the hare. This time, however, the winner may be a third and invisible participant called ‘Big Foot’, representing Big Data and Data Analytics.

© 2013 Tom Kadala

Could PayPal become the Global Reserve for Cash and Data?

As PayPal continues to reinvent itself, expect the mother of all disruptions: a global currency comprised of cash and data. Similar to how voice and data coexist over the same copper wire today, PayPal’s next move will co-mingle cash and data over a shared platform. Instead of bits and sound bites, however, PayPal hopes to seamlessly integrate customer and peer data (in the cloud, of course) and deliver customized business intelligence across multiple platforms to small business merchants all over the world — right when they need it most. There is one catch: every merchant transaction, including credit cards, would have to involve PayPal.

Last July, at a PayPal-sponsored ‘Battle Hackathon’ event held at AlleyNYC (alleynyc.com) near Times Square, over 100 local software developers worked through the night in small groups to create a new ‘killer app’ of their choice for a chance to win a $100,000 grand prize. This event was one of ten stops along PayPal’s world tour, which included Barcelona, Berlin, Moscow, Seattle, and Tel Aviv. Throughout the night, PayPal’s minions were on hand to help developers integrate a list of special-access APIs (Application Programming Interfaces) into their code. These APIs offer developers controlled access to PayPal’s databases. Aside from identifying worthy programmers for hire, PayPal uses these hackathons for feedback on its growing library of APIs. While attending, I caught up with John Lunn, the global director of PayPal’s developer network.

A former marine biologist who compares PayPal’s membership behavior to schools of fish, Lunn shared some eye-popping statistics from PayPal’s extensive databank.

  • 65% of items purchased in a retail store were researched beforehand on the Internet.
  • 43% of browsers at a retail store actually make a purchase.
  • 37% of shoppers who price-compare in the aisles using a smartphone app complete their purchase online later.
  • On average, 15-year-olds will remain on a retailer’s web page for less than 6 seconds.

“You have to be where your customers frequent,” claimed Lunn, who strongly believes that the future of the web is with mobile devices, especially since near-term market predictions for mobile payments are upwards of $20b. Already a prominent player, PayPal expects to process $7 billion in mobile payments next year, ten times its volume of two years ago.

The increased payment activity has PayPal eyeing the customer-specific behavioral and buying-preference intel that can be extracted from the transactional data. Rich in detail, this harnessed data could become a game-changer for small retailers.

“Without data, you actually know nothing about the consumer,” Lunn exclaimed. Conversely, with data, a merchant can address a customer’s wants and needs at a lower cost. Showing customers what they will most likely purchase, based on their personal profile and peer comparisons, can make every aspect of running a business immensely easier and more efficient. From marketing, sales, inventory control, and retail space to employees on the floor, every improvement based on better business intelligence derived from rigorous data analytics and self-teaching algorithms will have a lasting impact on the rest of the business as well as on its corresponding supply chain.

A customer’s buying experience is important too…  

“Buyers no longer want to wait in line,” Lunn notes. …and why should they, if technology can enable them all to step up to the same checkout counter simultaneously? Lunn used Jamba Juice as an example of how PayPal cardholders can order their favorite drinks from a mobile device and use face recognition to verify the purchase in the store. There’s no waiting around, since drinks are prepped in time to be picked up. Watching a worker cut up vegetables and blend a customer’s health drink was once perceived as fresh and worth the wait. Not anymore. Consumers value their time as much as they do the products they buy.

With buyers far more knowledgeable about products than ever before, the only line of defense available to merchants is a deeper understanding of their customers’ buying habits. But knowing what a customer purchases in one store is not enough to make a difference. Merchants need access to richer and more timely intel about their customers and their peers: not just what they bought recently from them, but elsewhere too, with other merchants, on or offline, locally and globally. With access to this much data, merchants could target their best customers and provide them with exceptional service, especially during the few minutes a customer spends at the checkout counter. For example, once a face is recognized at the register or an account number is entered, hundreds of data points could be co-mingled, correlated, and then calculated instantly between PayPal’s and the merchant’s databases to extract a customized product recommendation, such as a special offer or custom-printed coupon booklet. Each timely recommendation would help build a stronger bond with the store’s brand.
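
As a toy sketch of that co-mingling step, the Python snippet below joins a hypothetical cross-merchant purchase history (a stand-in for the kind of network data PayPal would hold, not its actual API) with a merchant’s own offer catalog to produce a tailored offer at the register. All identifiers, products, and the matching rule are invented for illustration.

```python
from collections import Counter

# Hypothetical network-wide purchase history keyed by a recognized customer ID.
network_history = {
    "cust-42": ["running shoes", "energy gel", "gym membership", "energy gel"],
}

# The merchant's own catalog, mapping product categories to in-store offers.
merchant_offers = {
    "energy gel": "2-for-1 energy gel at the register",
    "running shoes": "10% off insoles with any shoe purchase",
}

def recommend(customer_id: str) -> str:
    """Pick the offer tied to the customer's most frequent recent purchase."""
    history = network_history.get(customer_id, [])
    if not history:
        return "standard welcome coupon"
    top_category, _ = Counter(history).most_common(1)[0]
    return merchant_offers.get(top_category, "standard welcome coupon")

print(recommend("cust-42"))   # -> "2-for-1 energy gel at the register"
```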

Integrating into a merchant’s database or CRM system requires an army of developers. PayPal knows this and hopes that its easy-to-use APIs will encourage developers to include PayPal in their clients’ transaction processing needs. PayPal’s inclusion would do away with the ‘clunky terminals and expensive equipment’ many merchants use today to process credit card payments. However, to make PayPal’s ambitious business intel plan really work, every merchant on the planet would have to become a PayPal member.

Could PayPal become the Global Reserve for cash and data?

To appreciate PayPal’s shrewd and brilliant strategy, pick up a copy of a fascinating book titled “The PayPal Wars” by Eric M. Jackson. The author explains in compelling narrative detail how the simple idea of helping world economies through job creation, prosperity, and world peace hinges upon merchants trading freely and seamlessly across borders. If merchants in the Congo, for example, could sell their goods as easily to a local buyer as to a buyer in New Zealand, their improved cash flow would help strengthen their local economies and grow their businesses.

PayPal’s past success was predicated on the individual support of its very members. When eBay tried to replace PayPal with an in-house solution called Billpoint, PayPal’s members rebelled. …and after many other similar competitive encounters, members could indirectly claim a personal stake in PayPal’s ongoing success. Their formidable presence overwhelmed even the craftiest challengers. Time will tell whether PayPal’s loyal customers will once again help it forge on with its ambitious quest to become the Global Reserve for cash and data.

© 2013 Tom Kadala

To Byte or Not to Bite: The Myths, Realities, and Trends behind the Science of Big Data Analytics

Without data, a company would never survive in today’s global environment. With some data, it might have a fighting chance, depending upon the quality and timing of the information. But what happens when a company has access to too much data, sometimes referred to as ‘Big Data’? Ironically, it too could go out of business, even with the best technology and staff to manage it. Why? …partly because the data’s ultimate value depends upon who interprets and communicates the recommendations to the rest of the company, a task often left to an internal employee or ‘Data Scientist’ who may be no more than a recent university graduate armed with theories and little industry practice.

According to Dr. Jesse Harriott, the Chief Analytics Officer at Constant Contact and co-author of “Win with Advanced Business Analytics”, “setting up a data analytics initiative within a corporation is not a trivial endeavor”. It requires a lot of sponsorship at the corporate level and can take a year or two before achieving a meaningful balance between the influx of web data and its collective value to the company. Harriott shared his wisdom at a recent conference in Boston titled “The Science of Marketing: Using Data & Analytics for Winning”. This power event, organized by MITX, a Boston-based non-profit trade association for the digital marketing and Internet business industry (mitx.org), served up an impressive lineup of expert panelists who shared their best practices and experiences.

Among them was a star performer, care.com, the largest online directory connecting those in need of care with care providers. Its co-founder and Chief Technology Officer, Dave Krupinski, discussed how the company uses analytics to drive all aspects of its marketing function, including attribution analyses, customer segmentation, user experience, and predictive analyses. As Krupinski explained to a packed room of 300+ professionals, “most CEOs blindly jump into ‘big data’ analytics expecting immediate returns, only to discover (and after great expense) the many intricacies required to get it right.”

Is ‘big data’ analytics really worth the trouble?

If economic times were healthier, then maybe not, but in a slowing economy companies are forced either to come up with the next differentiating product or service that will give them an edge over the competition or to figure out better ways to surgically target likely buyers based on real-time data. Increasingly, though, fickle-minded consumers whose loyalties remain largely unpredictable have made the task exceptionally challenging. …and yet no one can blame consumers for their lack of brand loyalty when, on average, they are bombarded with over 500 ad messages per day.

A Typical Corporate Scenario
In a mocked-up example for discussion purposes, a typical CEO hires a ‘Data Scientist’ or promotes someone from IT after reading positive reports about companies that have boosted their sales using ‘big data’ analytics. Once budgets are allocated and a team is in place, software with funny names such as Hadoop, MapReduce, and HAWQ appears. These packages digest massive data sets (mostly unstructured data from the web) and respond quickly to complex SQL queries entered by a team operator or analyst. The output is then parsed into a more visually friendly format, perhaps using expensive Business Intelligence (BI) software, and, when ready, shared at weekly management meetings. In this example, the meeting is adjourned abruptly: management felt that the results from the Big Data Analytics team were not aligned with corporate priorities, a common problem that places part of the blame on the Data Scientist’s poor understanding of management’s business needs and part on the CEO for not creating comprehensive, formal data governance.
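
For readers who have never seen what such a team actually runs, here is a minimal sketch of the kind of SQL-on-big-data query described above, written with PySpark, one common interface to Hadoop-style clusters. The file path, schema, and query are hypothetical and exist only to show the shape of the work.

```python
from pyspark.sql import SparkSession

# A minimal example of a query a big data analytics team might run over
# semi-structured web logs. The path and field names are assumed, not real.
spark = SparkSession.builder.appName("weekly-clickstream-report").getOrCreate()

clicks = spark.read.json("hdfs:///data/web_logs/2014/week_08/")  # assumed layout
clicks.createOrReplaceTempView("clicks")

top_pages = spark.sql("""
    SELECT page_url,
           COUNT(*)               AS visits,
           COUNT(DISTINCT user_id) AS unique_users
    FROM clicks
    WHERE event_type = 'page_view'
    GROUP BY page_url
    ORDER BY visits DESC
    LIMIT 20
""")

top_pages.show()   # the table a BI tool would later turn into a dashboard
spark.stop()
```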

Disappointed CEOs tend to view ‘big data’ analytics as a ‘think-tank’-style department that delivers flawless dictates to the rest of the company, when in fact ‘big data’ analytics should be a collaborative data-sharing effort among all departments. The secret to getting ‘big data’ analytics to work is less about massaging structured and unstructured data quickly behind closed doors and more about the timely reintegration of field data from every department to continually tweak predictions and outcomes.

What should a CEO do to encourage data sharing among departments?

Most department heads do not share their data with their cohorts, either by choice or due to incompatibility issues. To address this reluctance, a CEO should first explore a standardized database structure and data exchange format that would allow departments to share their data seamlessly. Next, he or she should develop an incentive plan to encourage staff members not only to share their data but to request data from others. The fewer restrictions imposed on inter-departmental data exchanges, the more likely new ideas will blossom. Moreover, the positive behavioral changes in the workforce will help the data analytics team stay focused on corporate priorities. Keeping internal operations lubricated with both internal and external data analytics will boost a company’s revenues by default. This approach can lead to a passive revenue strategy that relies more on balancing an operation guided by ‘big data’ analytics than on traditional consulting advice or CEO hunches.
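
One way to picture such a standardized exchange format is a small shared record schema that every department writes to and reads from. The sketch below is a hypothetical example in Python; the field names and the JSON wire format are assumptions chosen only to show the idea.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json

@dataclass
class CustomerEvent:
    """A hypothetical company-wide record any department can emit or consume."""
    customer_id: str
    department: str        # e.g., "sales", "support", "marketing"
    event_type: str        # e.g., "purchase", "complaint", "campaign_click"
    value: float           # monetary value or score, department-defined
    occurred_at: str       # ISO-8601 timestamp, always UTC

def to_exchange_json(event: CustomerEvent) -> str:
    """Serialize to the common wire format shared across departments."""
    return json.dumps(asdict(event))

event = CustomerEvent(
    customer_id="cust-42",
    department="support",
    event_type="complaint",
    value=0.0,
    occurred_at=datetime.now(timezone.utc).isoformat(),
)
print(to_exchange_json(event))
```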

A Five Stage Journey
I turned to a visiting professor at the Harvard Business School, Tom Davenport, to categorize the ‘big data’ analytics journey a CEO can expect to take. Davenport listed five progressive stages needed to achieve ‘big data’ competence in today’s business environment. First are the ‘Analytically Impaired’ companies, which have some customer data but lack a centralized strategy to leverage its use. Next come the ‘Localized Analytics’ entities, which outsource their data needs to companies that follow traditional marketing practices. Then come the ‘Analytical Aspirations’ types, which centralize their data sources, enjoy C-level support, and operate an in-house data analytics team; at this level companies are just beginning to grapple with their ‘big data’ analytics issues. The fourth stage belongs to ‘Analytical Companies’, which are showing some success in using data to drive their business. Finally, at the top of the heap, are the ‘Analytical Competitors’: companies that have fully integrated proven algorithms combining unstructured web data with reintegrated field data to seamlessly predict a specific customer’s expected wants and desires, based on that customer’s history with the company and elsewhere, along with the same for their closest peer group.

Most daunting to any CEO is the notion that companies ranked at Davenport’s ‘Analytical Competitor’ level can rely almost entirely on their algorithms to run their business. The indisputable outcomes dictate the level of ad spend per quarter, the allocation of ads across platforms, inventory levels per SKU, the quality of maintenance support, head count, and so much more. At some point one might even ask what the role of management should be in a company ranked as an ‘Analytical Competitor’, and what talent and expertise will be needed to be an effective CEO in this soon-to-be new normal.

© 2013 Tom Kadala

Data Mining Lessons for Obama

Earlier this month, an ex-CIA employee and whistleblower, Edward Snowden, exposed the federal government’s six-year-old clandestine initiative, referred to internally as PRISM, a covert data-gathering program that began in 2007 as a corollary to the Patriot Act of 2001. This White House-directed domestic-espionage project has been collecting phone logs of millions of U.S. citizens from major telecommunication giants (e.g., Verizon, AT&T, and Sprint) and emails from nine prominent Internet companies (e.g., Google, Yahoo, Apple, Microsoft) in a concerted effort to thwart future terrorist attacks.

History shows that PRISM has prevented numerous incidents, including a foiled backpack bombing plot in New York in 2009. Despite its undisputed success record, PRISM has ignited a national debate on whether the administration has gone too far in seeking tighter security at the expense of civil liberties. In a statement to the American people, President Obama argued that his actions are justified.

 “You can’t have 100-percent security and also then have 100-percent privacy and zero inconvenience,” he famously stated. “We’re going to have to make some choices as a society.”

Not surprisingly, many Americans disagree with Obama’s position and have taken action. Among them is Sen. Rand Paul (R-Ky.), who will soon introduce a class action suit for which he hopes to obtain more than 10 million signatures. Those in favor of Obama’s PRISM believe that the price to pay for security is small in comparison. Just how damaging can a diverted phone log, or a random email read, be to anyone, if terrorist attacks can be prevented? However, when the process requires canvassing mountains of data that could randomly incriminate anyone, the fundamental basis of the U.S. judicial system, where defendants are considered innocent until proven guilty, is truly at risk.

‘Con’ concerns do not stop there. Dissenters argue that PRISM has set a precedent for the further erosion of individual freedom. Without a counter-mechanism in place, future leaders will more than likely continue to up the ante on domestic surveillance until an unimaginable, automated version of a Russian-style KGB informant process becomes undetectable and virtually unstoppable. If you are skeptical, consider what happened with consumer debt after Reagan’s supply-side economics took hold: every American consumer was doused with credit cards. The combination of economic bubbles that followed eroded the effectiveness of our elected leaders in Washington, who today are trusted by less than 20 percent of the population.

If eliminating PRISM is not an option, then what mechanisms can be put into place, early on, to prevent domestic surveillance from reducing our individual freedom… and what solutions have worked in the past, and with whom?

Data Mining vs Mining Precious Metals
Lessons can be learned from another type of mining activity that is remarkably similar to mining data: mining for precious metals in some of the most remote areas on the planet. Surprisingly, the operational principles of the two efforts are nearly identical. In both cases, expensive machinery and sophisticated software are used to sift through enormous amounts of data or ore. Both identify specific assets (i.e., key leads or gold nuggets) that in aggregate could create exceptional value, a value so great that individuals, corporations, or governments would break laws or silence whistleblowers to secure its use or acquisition. Finally, both processes are confronted with a conflicting tradeoff that involves the invasion of privacy of a constituency of voters.

Just as Americans feel an attack on their personal freedom from PRISM’s data mining activities, local communities in Peru, the Congo, Guatemala, and South Africa, to name a few, experience a similar personal upheaval when global mining companies (e.g., Barrick Gold, Rio Tinto, and many more) set up operations without the communities’ consultation or consent. Environmental disasters, such as toxic chemicals found in the water supply or increasing numbers of infant deaths and birth defects, have exposed rogue mining operations and over time have forced the hand of powerful politicians and legislatures to adopt legal mechanisms that protect the rights of affected community members.

Recent examples include Peru’s mining town of Bagua, where 34 people were killed in 2009 in a staged military attack against peaceful indigenous demonstrators. In the Congo, where many of the minerals used to make mobile phones and appliances are extracted, increasing local uprisings have forced mining operations valued at $1 billion to close. These uprisings are clear evidence of a failed system or policy. They offer a lesson and an illustration of a similarly dark future for Obama’s PRISM project, if left uncontested.

The striking resemblance between data mining and traditional mining suggests that some of the best practices used to resolve conflicts in the mining industry could also be applied to the PRISM project to keep it from escalating and potentially causing a ‘trust rift’ between the US government and the American people.

Americas Society/Council of the Americas
At a recent gathering of the Americas Society/Council of the Americas (AS/COA) in New York City, a distinguished expert panel with deep field experience working on some of the toughest mining-related conflicts in the world offered insights, best practices, and ongoing recommendations to a packed audience of interested parties from non-profits, NGOs, and private investors. (AS/COA’s recent issue of Americas Quarterly covers additional details.)

To qualify as a best practice, the panel highlighted a simple yet fundamental requirement: a transparent two-way conversation between the mining company’s project team (consultation) and the local community (consent). Social unrest is almost inevitable when the conversation becomes opaque and one-way, or, as the industry refers to it, “consultation without consent”. Fortunately, legislative progress continues in countries like Peru, where mining laws have been passed that require both consultation and consent, for example in cases where a community is forced to move.

One of the expert panelists, Rachel Davis, managing director at ShiftProject.org, highlighted the imperative need to include consent mechanisms. To this end she outlined three key challenges that mining companies must address properly to ensure an open dialogue with an impacted local community.

  1. Offer a venue for consultation, but be prepared to spend at least one month of face time to earn the people’s trust. “Trust,” she emphasized, “is the imperative currency for collaboration.”
  2. Train staff members within the mining company to develop a genuinely concerned attitude along with the skills to handle awkward conversations or even hostile responses.
  3. Ensure access within the company to handle grievances, and build the capacity to coordinate those efforts within a cross-functional corporate structure.

Emily Greenspan, Senior Policy Advisor at Oxfam America, tweaked Davis’ three points by adding one more important stipulation: she recommended that mining companies evaluate how decisions are made at the local level.

“Taking the time to understand the culture, temperament, timing requirements, and so much more is crucial from the outset,” Greenspan explained.

Her comments reminded me of President Obama’s lunch engagements with members of Congress earlier this year. They were, in my opinion, too little, too late to have the desired effect. Had Obama requested these luncheons at the beginning of his first term, perhaps today’s paralyzing partisan gridlock would have found common ground. The lesson learned is the vital importance of getting to know your audience from the beginning; otherwise the cost of catching up becomes prohibitive and the effort meaningless.

With the looming ‘black cloud’ surrounding the PRISM project, Obama would do well to learn from his prior experiences and heed the advice of field experts, many of whom are already within his administration working on global mining issues for precious metals. Why not tap their wealth of experience to help clean up the PRISM mess?

As history has shown in the mining industry, time may be running out for Obama. A failed policy of this magnitude could turn into an irreversible tide of social unrest.

© 2013 Tom Kadala

Will Sustainability become the Feared Equalizer?

Why is the price of oil still hovering around $100 per barrel if global demand has fallen and the supply of alternative energy sources, including shale and renewables, is increasing? Could it be that commodity traders are reacting to a new series of less visible market forces?

We know that whenever Iran talks up its nuclear energy aspirations or Israel fires missiles into Syria, oil prices tend to rise or, as of late, not drop by much. There is also the US Congress’ lack of a comprehensive long-term energy policy, which has kept a tight rein on infrastructure investments such as charging stations for electric vehicles. However, as I discovered recently, there is yet another force at play, one that is far more complex than society is prepared to confront today and that will surely cause the price of oil and similar fossil fuels to double, if not triple, in the coming decades. This invisible force is referred to as sustainability.

What exactly is sustainability? In simple terms, sustainability is about replenishing a resource so it can be used again and again. Terms like ‘recycling’ trash or producing ‘renewable energy’ are commonly associated with the practice of sustainability, the act of sustaining an activity in perpetuity with minimal environmental damage. Perhaps the best example of sustainability is e-books, because they never wear out from one user to the next and can be reproduced millions of times from one stored copy. Nevertheless, sustainability is more than just a repeatable process. It is also a culture, an attitude, a way of thinking that inspires inherent behavioral changes in socially acceptable consumption practices.

MIT’s Sustainability Summit
At MIT’s Sustainability Summit last month, I came away with a deeper appreciation for what sustainability can mean to different people, especially how it can motivate them to change their habits and the habits of others, and yet I could not help but feel discouraged by the global indifference and the immense size of the problem. What set me over the edge was a powerful video called ‘The Art & Science of Chasing Ice’, produced by James Balog, on how our north and south polar ice caps are melting away from the black soot dispersed into the atmosphere by our factories and automobiles. If this visual does not do it for you, then perhaps a TED video by Charles Moore on the Great Pacific Garbage Patch may bring it home. The visuals are truly stunning, rude awakenings of what a planet of 7 billion individuals is capable of doing wrong.

With the UN projecting 9.1 billion people by 2050, one can be absolutely certain that issues of sustainability will be front and center in the daily life of every individual and entity. Why? …for the simple reason that our planet’s resources are limited and our current lifestyles and diverse cultures have yet to align and adapt to sustainability-friendly behavior.

After attending the MIT Summit, I concluded that the efforts to align sustainable priorities are not only a discombobulated entanglement of disparate, self-appointed initiatives but also an odd assortment of potentially conflicting outcomes. To get an idea, take a look at two opposing attitudes toward car ownership among city dwellers. While the new normal has shifted favorably toward shared auto usage among urbanites in developed countries (e.g., Zipcar in the US), in emerging countries (e.g., Brazil, China) new consumers expect to own their own car as soon as they move into a city!

Walmart vs Whole Foods
Another example of conflicting outcomes was visible at an Atlantic Magazine press conference in Washington, DC on December 4, 2012, where a forum of experts showcased the sustainability policies of two retail food companies, Walmart and Whole Foods. While both companies work closely with their suppliers to recycle waste and introduce biodegradable packaging, Walmart’s Beth Keck, Senior Director of Sustainability, explained that Walmart provides its tight-fisted consumers with environmentally friendly products and chooses not to educate them on how they should change their consumption attitudes toward a more wholesome, sustainable lifestyle.

In curious contrast, Whole Foods’ counterpart, Kathy Loftus, Global Leader of Sustainable Engineering & Energy Management, stated that, with one-tenth the number of retail outlets Walmart has, Whole Foods is deeply committed to educating its employees and the communities it serves. The company teaches sustainability as a shared problem that begins with each and every consumer. Whole Foods believes that improved knowledge of how one’s food is handled and prepared can help consumers make better choices and therefore lead healthier lives with fewer medical issues. The money saved on doctor’s visits and drugs, for instance, could justify Whole Foods’ higher prices, …which explains in part why Walmart, with its cadre of low-priced, branded, processed food suppliers, has avoided engaging directly with its consumers.

Will the term ‘sustainability’ become just another commonly used marketing term, like ‘green’, ‘organic’, and ‘hormone-free’, that companies can push at will to meet their own corporate business agendas? …maybe not this time.

A Key Driver – Shareholders
Fortunately, the investment community is making meaningful strides with shareholders and CEOs. According to Sustainalytics, a Boston-based firm, companies are increasingly eager to disclose their annual ESG (Environmental, Social and Governance) scores, a metric used to measure best practices. A total of 3,600 corporations globally have signed on since 1992, but as Annie White, the firm’s Research Products Manager, noted, they have only scratched the surface, with over 40,000 public companies still remaining.

Driving the increasing interest in ESG scores are concerned shareholders who fear that unmanaged risks or ‘blind spots’ could unexpectedly bring a global company to its knees, as happened with BP’s Gulf oil spill of 2010, the Foxconn child labor practices that affected Apple earlier this year, and the five garment factories producing European and American branded clothing that collapsed in Bangladesh this month. With good reason, shareholders are concerned that similar disasters will become more commonplace and that reactionary foreign government retaliation could put them out of business.

According to Katie Grace, a Program Manager with the Initiative for Responsible Investment at the Harvard Kennedy School, local governments do not have to wait for a catastrophe to legislate changes; they can take a proactive role by setting project-specific policies. Regionally, for example, they can rezone areas to attract private-sector investment. They can also set standards such as LEED, which is used for certifying eco-buildings. For social projects, governments can issue ‘green bonds’ or payment guarantees for investment funds (e.g., social impact bonds). Some mayors, like Philadelphia’s Michael Nutter, have adopted these proactive recommendations in their sustainability efforts and are starting to see positive results.

The City of Philadelphia
Katherine Gajewski, Philadelphia’s Sustainability Director, a position now found in over 115 municipalities across the US, spoke of her challenges working within an entrenched bureaucracy of over 22,000 public employees, most of whom are reluctant to change. Her reprieve has been her frequent conference calls with her 115 peers, who openly share their best and worst practices. Their collective list of ideas has grown as the group continues to innovate together, making most of it up as they go along.

Some interesting cases that have already crossed Gajewski’s desk might surprise you. For example, an Enterprise car rental operation in an industrial section of Philadelphia was paying $400 per month for its water bill, while it was costing the City millions of dollars to purify its share of dirty runoff from its car lots. Eventually the situation was rectified, but not until Gajewski ran the numbers to show the disproportion between what Enterprise was paying for its office water usage and the cost of cleaning up its runoff.

Just how many other industrial installations are there in a typical city like Philadelphia, where a company unwittingly gets away with paying a small fee to use a common service while its operations account for a substantial share of the clean-up cost? …probably a lot!

Gajewski’s job as a Sustainability Director requires more people skills than know-how. She must craft alignments of interest among internal groups to achieve meaningful consensus. Perhaps most important, her role as director and facilitator requires her to refrain from becoming too preachy and to be willing to dole out credit to each participant. Easier said than done, Gajewski knows that sustainability is a shared task that succeeds only when everyone is on board.

As more Sustainability Directors like Gajewski identify similar imbalances in their respective cities, the idea of charging the same consumer for both usage and their share of the cost of cleanup will become more widely accepted. …and herein lies the reason why fossil fuel prices will continue to rise for years to come.

###

Below is a summary of Best Practices that were shared during MIT’s Sustainability Summit.

Best Practices

  1. At a university-run trash audit, MIT students sifted through a month’s worth of the university’s garbage and discovered that, of the 2.5 tons of trash collected, 500 pounds was food waste while the remaining 90% could be recycled! The visual impact of the more than $11.4 billion of trash that could be recycled in the US alone inspired one student to launch a 30-day waste challenge at https://www.facebook.com/30DayWasteChallenge, where Facebook friends could commit to ‘be inconvenienced by their trash’ by carrying the trash they personally generate throughout the day for a 30-day period.
  2. Offering consumers a list of prices for the same product packaged with different levels of biodegradable materials would help bring to light the importance of recycling.
  3. Wirelessly integrating a soda vending machine with a nearby recycling bin could encourage consumers to recycle their containers. Consumers would pay, say, two dollars and fifty cents for a soda and receive a one-dollar refund on their university credit card once the can was disposed of in the appropriate recycling bin within the allotted time.
  4. ‘Rewire’ individuals at opportune times so their behavioral changes continue well after a recycling program or contest ends. For example, students are most open to behavioral change during a time of transition, such as the beginning of a semester. Recycling contest rules would be established at the start of the semester and monitored throughout the year.
  5. The crop of graduating students entering the workforce concerned about sustainability issues will inspire a new set of hiring qualifications. Already, companies like Whole Foods have changed their hiring criteria to reflect their corporate sustainability goals.
  6. Teaching children in lower school to become advocates for a sustainable future is the most effective use of funds for behavioral change. Not only do these youngsters represent the future of our planet, but their unbounded audacity in correcting adults who forget to recycle would deliver a priceless message with a lasting effect.
  7. A practical solution launched this year in California involves a utility tax on a consumer’s bill that is merely collected by the utility company and paid directly into a Global Educational Fund for educational initiatives. The tax removes the utility’s burden of financing similar programs for its sector and uses the utility’s billing capacity as a pass-through.
  8. Whole Foods spends time in Washington, DC convincing lawmakers that refrigeration codes need upgrading. Currently, stores are allowed to use open refrigeration, which, according to Whole Foods spokesperson Kathy Loftus, consumes considerably more energy than the same refrigerator with a door. Another sustainability tip from Whole Foods is the wider use of ships rather than trucks to transport goods; according to Loftus, ships have a smaller environmental impact than trucks.

© 2013 Tom Kadala