Will your next Best Friend be a BOT?

With 5G phones on the horizon, state-of-the-art voice technology is positioned to become the next wave of AI-driven devices that will significantly change every aspect of our personal and professional lives. These disruptive devices will provide us with on-demand, relevant suggestions and viable step-by-step pathways for solving complex problems. In fact, the changes in store, both technological and behavioral, at home and in the workplace will be so extreme that the laptop or mobile phone you are using to read this article is destined to become another glorified doorstop. Here's what you can expect.

Googling as we know it today will change from a one-line entry to an instantaneous voice exchange between you and a Bot. Rather than delivering a list of links, the output your Bot fires back will be specific to your questions. Downloading apps onto your smartphone will no longer be necessary, because your Bot's AI will scour every major app available and connect you with the one that best meets your needs. Your Bot will not only deliver optimal answers but also pose insightful questions for you to consider. Username and password requirements will no longer be necessary; instead, the voice print you use to speak into a device will vet you seamlessly. HTML-driven web page searching will soon appear primitive and inefficient. Even the 'water cooler' chat among peers for pollinating ideas will become a distant memory.

Due to the FOMO (Fear of Missing Out) effect, I decided to attend the Conversational Interaction Conference in San Jose, California this year. The two-day annual event, held in early February, provided me with valuable industry insights, updates on working applications, future blueprints, new industry buzzwords, plus a list of key hurdles confronting voice technology developers today. Below is a brief synopsis of what I saw and heard at the conference. I have included links and capitalized key industry buzzwords for easy identification. After reading this article and accessing the links provided, you should have sufficient intel to ask better questions about the current state and future of voice technology.

Voice Controlled Devices

Let us start with some familiar Voice Controlled Devices already in the marketplace, such as Amazon's Alexa, Apple's Siri, Microsoft's Cortana, and Google Home. These ubiquitous smart devices let us use our natural voice to make on-demand requests, such as playing a song, reading the news, or reporting the weather. For the most part these devices do a decent job of what the industry refers to as "Command and Control" applications. On the backend, a user's command such as "Alexa! What is the weather?" is converted from speech to text using Speech Recognition (SR), then passed on to a Natural Language Processing (NLP) engine, which invokes Deep Learning (DL) and Neural Network (NN) algorithms to analyze and prioritize keywords. Backed by powerful cloud-based computing, multiple algorithms use these keyword inputs to generate a text response. This text is then converted to speech using Text-To-Speech (TTS) and delivered back to the device. The entire process takes less than half a second, which is within the tolerance of a normal two-way conversation between two humans.
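To make that round trip concrete, here is a minimal sketch of the pipeline described above. The function names and their behavior are illustrative placeholders, not any vendor's actual API; a production assistant would call cloud SR, NLP, and TTS services at each step.

```python
import time

# Hypothetical helpers standing in for cloud services; the names are
# illustrative placeholders, not Amazon's or Google's actual APIs.
def recognize_speech(audio_bytes: bytes) -> str:
    """Speech Recognition (SR): convert the spoken command to text."""
    return "what is the weather"

def extract_intent(text: str) -> dict:
    """NLP step: prioritize keywords and classify the user's intent."""
    keywords = [w for w in text.lower().split() if w not in {"what", "is", "the"}]
    return {"intent": "get_weather", "keywords": keywords}

def generate_response(intent: dict) -> str:
    """Backend logic (deep learning models in production) builds a text reply."""
    return "It is 72 degrees and sunny."

def synthesize_speech(text: str) -> bytes:
    """Text-To-Speech (TTS): convert the reply back to audio."""
    return text.encode("utf-8")

def handle_command(audio_bytes: bytes) -> bytes:
    start = time.monotonic()
    text = recognize_speech(audio_bytes)
    reply_audio = synthesize_speech(generate_response(extract_intent(text)))
    elapsed = time.monotonic() - start
    # The whole round trip should stay under roughly half a second
    # to feel like a natural two-way conversation.
    assert elapsed < 0.5, "response too slow for natural conversation"
    return reply_audio
```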

Every time an individual wakes up a Voice Assistant Smart Device with their voice (e.g., "Alexa!" or "Hey Siri!"), the commands uttered afterward are stored and scored. These unassuming devices automatically deliver your data to a cloud-based voice service platform, which appends it to a Machine Learning training data set used to improve the algorithm's overall accuracy. Ultimately these devices will mimic human-to-human conversational interactions, similar to two friends chatting casually in the same room. A good visual of this concept appears in a trailer for 2001: A Space Odyssey, where a talking Voice Assistant called HAL 9000 converses seamlessly with the protagonist.

To get a glimpse of how close we have come to HAL 9000, Aigo, a California-based startup, produced a video comparing its breakthrough technology with Alexa's. (Be sure to view the entire video.) It reveals what one can expect from these devices going forward. On a similar game-changing trajectory is BMW's Intelligent Personal Assistant, an advanced voice-activated driving experience. With this voice package, BMW aims to eliminate dashboard touch-screens, …the outcome of which could inspire radically different auto interior designs, in particular for autonomous vehicles.

These Voice Assistant Smart Devices are part of a larger category called Voice User Interfaces, or VUIs. VUIs include any smart device or app that relies on a user's voice for input commands. These VUIs could be life-size, human-like Robots, Chatbots, Holographs, and more. Let's take a closer look at who is doing what in this space.

Robots, Chatbots, Holographs

SoftBank Robotics created a friendly-looking robot called 'Pepper' that companies such as HSBC and Carrefour use to greet their customers. Like a dispatcher, 'Pepper' directs customers to the appropriate employee or department in the customer's language of choice. In the event of a communication error, 'Pepper' has an integrated, iPad-size screen for optional click inputs. 'Pepper's' disarming, child-like interactions dazzle audiences, especially when it successfully greets a customer by name. Similar to its brethren, 'Pepper' becomes smarter by matching images of its visitors with their corresponding Contextual Utterances. 'Pepper' has also shown promise with young medical patients, who tend to feel more comfortable chatting about their symptoms with a robot than with an adult.

What 'Pepper' does offline, Avatars do online and more. Avatars interact with individuals from a web page or app using voice, text, or both. Just as ringtones gave mobile phones a customized personality, Avatars can help set the tone for interactions by mimicking a celebrity's voice and image or simply projecting a voice with an engaging accent. Sapientx, a San Francisco-based firm, specializes in creative Avatar designs. Their creations help humanize the Chatbot experience, blending entertainment with genuine, positive interactions. They can also inject a visual and voice branding experience that consumers identify with personally. The firm's white paper expands upon how this interactive branding power can transcend generations in many positive ways.

A typical Chatbot experience requires either a click or utterance from the user. Depending upon the App, a Chatbot will reply via voice or text. Users communicate directly with the activated device (via voice or text) and receive near instantaneous responses, …just as though they were interacting directly with a department head.

Bank of America's Chatbot, or Virtual Financial Assistant, is called Erica. With Erica, users can learn about their personal spending habits, receive suggestions for improving their finances, and learn about ancillary services available at the bank. This solution is ideal for up-selling and cross-selling campaigns that leverage a client's changing needs.

Workday.com, a cloud-based business services company, offers a Chatbot to help its employees stay connected. The Chatbot improves efficiency, knowledge sharing, and collaboration. By integrating this platform in the workplace, Workday.com employees are regularly reminded of their fundamental core values, which in turn helps promote management's engaging culture internally.

Another interesting example came from Murphy Oil, an oil and gas company located in Arkansas. Management hired Alan AI, Inc., a Texas-based enterprise mobile development firm, to design a conversational voice Chatbot for field engineers to use while checking and maintaining the company's equipment. This 'supercharged' Voice Assistant gives engineers the ability to trigger required workflows and monitor equipment progress in real time. Other Chatbot enterprise developers at the conference included Rulai and Grid Dynamics, which is pending an IPO.

So far the industries leveraging Chatbots include Entertainment, Healthcare, Financial Services, Cable/Telecommunications, and Utilities. The bulk of applications, however, are for specific internal use such as HR, expense reports, and employee directory assistance. Essentially they address all the little nagging items employees endure on a day-to-day basis (e.g., recording travel receipts). Internal applications enjoy greater success because they can leverage a company's industry jargon and finite database to achieve higher levels of speech recognition accuracy.

If you are looking for winning applications for internal use, consider reaching out to Oracle's Conversational Design Team, the group behind Oracle's Virtual Assistant. What caught my attention at the event was that Oracle deliberately focuses its resources on internal applications that deliver 95% or higher voice response accuracy. This 'high bar' approach has given the team the bandwidth to address more challenging issues, such as multiple requests in the same 'Utterance', …also referred to in the industry as 'Intent Handling'.

Finally, Microsoft delivered an impressive keynote, which included a speech-to-text demo run from the very PowerPoint used to deliver the presentation. As the keynote speaker, Xuedong Huang, Technical Fellow and Chief Speech Scientist for Microsoft, addressed the audience in his Scottish-accented English, a Spanish text version of his words appeared in real time below his slides. Being fluent in Spanish, I could personally verify that the subtitle translation was impressively authentic. This near-flawless application can be paired with any two of the 50 languages available; a presentation spoken in French could display Chinese subtitles, or vice versa. The service is currently embedded in Office 365 and costs about $1 per hour of speech.

As though this feat were yesterday's achievement, Mr. Huang felt obligated to awe the audience with yet another example of 'Human Parity'. This time he used his own voice print and intonations to display an image of himself conversing natively in Japanese. According to a Japanese-speaking member of the audience, the Japanese delivery also sounded authentic.

In a successful attempt to leave a lasting impression on the audience, Mr. Huang played a video of Julie White, an actress and global motivational speaker. She used a life-size hologram of herself to deliver a speech in native Japanese to an audience in Japan, …all the while remaining at her home in San Diego, California. The implications of this breakthrough technology, both good and devious, were equally startling and remain open for debate.

Creating Your Own Chatbot? – Key Design Issues to Consider

Before you decide to launch your own Chatbot, it would be wise to align your lofty expectations with a dose of reality. Voice technology is much harder than it may appear. It is its own worst enemy, because incremental successes tend to propagate the need for more backend technology, which in turn unleashes more complexity, often at a geometric progression. Fortunately, as the recent Conversational Interaction Conference attests, the industry has achieved many impressive breakthroughs. However, the battle to achieve 'human parity' across all platforms and applications on a sustainable basis continues. Here is what keeps developers up at night.

Hardware issues… Noisy environments can pose serious problems for Ambient Voice Exchanges. Amazon's Echo units currently include 5 unidirectional mics that can pick up surrounding voice commands from multiple angles. More may be needed… For Chatbot apps, however, the key issue is just the opposite: background noises, such as traffic or machinery, need to be eliminated. UmeVoice, Inc., a headset manufacturer, offers military-grade, noise-canceling headsets and earbuds that help drown out surrounding noises, allowing users to provide voice inputs in practically any situation with excellent clarity, including audible whispers.

Software challenges… The backend engines that seamlessly support Voice Controlled Devices from the cloud depend upon ongoing advancements in Natural Language Processing (NLP), Deep Learning (DL), Neural Networks (NN), Speech Recognition (SR), Text-To-Speech (TTS), and a slew of acronyms yet to be named, …plus all of their respective ancillary development tools! The process is never-ending…

UI/UX design protocols… Humanizing Bots requires a deep understanding of acceptable human behavior. UI/UX (User Interface / User eXperience) designers deliberately include facial expressions to invite positive emotions from the user, especially in the event a voice exchange fails to meet expectations. In the industry this soft yet crucial issue is known as 'Failing Gracefully'. To appreciate its importance, imagine the mounting frustration you would feel if the other person not only delayed their response but also repeated the same question multiple times. Natural voice exchange tolerance hinges on less than half a second per response. Any longer and a simple dialogue between two individuals risks sounding like two separate conversations, …the likes of which would breed irreparable frustration, distrust, and confusion.
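As a rough illustration of 'Failing Gracefully', the sketch below wraps a backend call in a timeout so the Bot acknowledges the user instead of going silent when it misses the conversational budget. The half-second figure comes from the tolerance cited above; the function names and the fallback phrase are invented for illustration.

```python
import concurrent.futures

RESPONSE_BUDGET_SECONDS = 0.5  # rough conversational tolerance cited above

_pool = concurrent.futures.ThreadPoolExecutor(max_workers=4)

def answer_with_graceful_fallback(query: str, backend) -> str:
    """Ask the backend for an answer, but never leave the user hanging past the budget."""
    future = _pool.submit(backend, query)
    try:
        return future.result(timeout=RESPONSE_BUDGET_SECONDS)
    except concurrent.futures.TimeoutError:
        # Failing gracefully: acknowledge the user instead of going silent
        # or repeating the same prompt multiple times.
        return "Let me check on that. One moment, please."
```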

Even with the best hardware available, voice applications can fail due to poor conversational interaction protocols. For example, the gender of the voice or the accent used to interact with a user may come across as unappealing or unfriendly. In the case of Chatbots, replies may be too long, irrelevant, potentially insulting, or even 'creepy', especially if the Bot divulges personal details unintentionally. Even a user can fill the feedback loop with poor data by inadvertently gaming the system, either by speaking unnaturally slowly or by using specific terms out of context.

Conversational interactions should be fun, entertaining, terse, non-invasive, and to the point. Bots should engage with users to learn more about them without seeming burdensome. Each exchange should build upon the previous one to help profile the user. The more the Bot knows about the user, the more likely its suggestions will resonate and the more the all-encompassing trust factor increases. In short, the key design challenge is to find the balance where the Bot can gracefully and gradually extract more intel from a user while simultaneously integrating the aggregate data to provide the user with more relevant and timely suggestions. This fine line of being helpful while remaining invisible and accurate is an ongoing industry challenge, …and even more so when a growing list of backend technology issues is considered simultaneously. It is at this tenuous yet exciting juncture where the industry stands today.

Some of the Industry’s Greatest Challenges

Perhaps the greatest challenge for Bots is handling instructions that change midstream. This occurs when a user asks for one thing and then, within the same utterance, suddenly changes his or her mind and asks for something else. On the receiving end, undoing one command in favor of another can cause processing algorithms to break down. A reset routine to handle the new request is a possible workaround; however, the additional processing time could create a delay that exceeds the half-second conversational interaction requirement.
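One possible (and admittedly simplistic) workaround is to detect correction cues inside the utterance and act only on the final request, rather than resetting the entire pipeline. The sketch below is illustrative only; the cue list and function names are assumptions, not any platform's actual intent-handling logic.

```python
import re

# Phrases that commonly signal the user changed their mind mid-utterance.
CORRECTION_CUES = re.compile(
    r"\b(actually|no wait|scratch that|instead|never mind)\b", re.IGNORECASE
)

def resolve_final_request(utterance: str) -> str:
    """Keep only the text after the last correction cue, if any.

    'Play some jazz, actually no wait, call my sister' -> 'call my sister'
    """
    parts = CORRECTION_CUES.split(utterance)
    # re.split with a capturing group interleaves the cues; the final element
    # is whatever the user said after the last correction.
    final = parts[-1].strip(" ,.")
    return final if final else utterance

if __name__ == "__main__":
    print(resolve_final_request("Play some jazz, actually no wait, call my sister"))
```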

…and if all of these challenges were not enough, developers rightfully complain about the need to code and maintain the same voice application for each platform: iOS, Android, Siri, etc. More often than not, these platforms will modify their APIs (Application Programming Interfaces) without warning, sending developers into a mad rush to fix each application!
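A common defensive pattern is to hide each platform behind a thin adapter so that when a vendor changes its API, only that adapter needs to be rewritten. The sketch below is a generic illustration of the idea; the class and method names are invented and do not correspond to any real SDK.

```python
from abc import ABC, abstractmethod

class VoicePlatformAdapter(ABC):
    """Thin wrapper around one platform's SDK (iOS, Android, Alexa, etc.)."""

    @abstractmethod
    def listen(self) -> str: ...

    @abstractmethod
    def speak(self, text: str) -> None: ...

class ConsoleAdapter(VoicePlatformAdapter):
    """Stand-in adapter for local testing; real adapters wrap vendor SDK calls."""
    def listen(self) -> str:
        return input("user> ")
    def speak(self, text: str) -> None:
        print("bot>", text)

def run_skill(adapter: VoicePlatformAdapter) -> None:
    # The skill logic is written once; only adapters change when a vendor
    # modifies its API without warning.
    utterance = adapter.listen()
    adapter.speak(f"You said: {utterance}")
```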

-###-

Tom Kadala is a technology innovator and freelance writer on topics related to artificial intelligence and machine learning. He is also the founder of RagingFX.com, a first-of-its-kind Autonomous Company.

© 2020 Tom Kadala

Did Beau Biden’s Battle with Brain Cancer Change the Course of History?  

What would the world be like today, if Beau Biden had become a cancer survivor?

His father, Joe Biden, would have run for President of the United States and very likely won the national election in 2016. The barrage of executive orders, the divisive hatred among so many Americans, and the degradation of the highest office in the world may never have come to be.

The Bidens never revealed the type of brain cancer that afflicted their son. Given its complexity, the disease might have been glioblastoma, an aggressive cancer that spreads throughout the brain with finger-like tentacles, or a more common type of glioma. Either way, every brain cancer patient is different, and developing a tailored treatment for each individual continues to be the industry's greatest challenge.

I have heard some amazing survival stories in which friends and colleagues skirted death one more time after receiving a miraculous drug, cancer treatment, or inexplicable voodoo. But for all the great stories of cancer survivors, thousands upon thousands either live out their remaining months at the painful mercy of chemotherapy or choose to meet their maker on their own terms. Morbid as it may sound, I had to ask myself: why do some cancer patients survive while others do not? …and how can so many customized treatments fail, like Beau Biden's, when others have managed to become cancer-free well beyond their predicted life expectancy?

A Google search on cancer survivors led me to a tumor-removing procedure and treatment whose statistics surprised me. The study appeared incomplete because, as stated in the footnotes, the cancer patients in this particular clinical study never died. Ironically, the lack of mortality data left the study inconclusive. Patients who were expected to survive for less than a year were still very much alive, now 10 to 12 years later, and enjoying 100% remission. The conclusions from the abstract read like the cure for a common cold, and the treatment used, called brachytherapy, was easy to understand. Despite the astonishingly high survival rate in this clinical study, however, what surprised me most was actually something else.

Brachytherapy is not new. It has been around since the early 20th century. When doctors remove cancerous tumors, they carve them out in much the same way a gardener removes a large weed. However, it is nearly as impossible to remove every last cancer cell associated with a specific tumor as it is for a gardener to pull out every root of a weed. The hidden remnant cells that are left behind become the primary cause of the disease's resurgence in cancer-stricken patients.

To remove these stray cancer cells before they can cause a recurrence, doctors will prescribe some form of radiation treatment, one of which is brachytherapy. …and herein lies the difference between treatments that really work to cure a patient and those that simply prolong the agony associated with the disease.

Most hospitals use external treatments such as chemotherapy or beam radiation to kill any remnant cancer cells after a tumor has been removed. Although effective, these externally applied treatments also come with formidable side effects such as hair loss and extreme debilitation. Beyond the side effects, the guarantee of a 100% cure has remained elusive at best. Frankly, neither the doctor nor the patient can ever know for sure whether every last trace of cancer cells has been completely annihilated.

An alternative to external radiation is brachytherapy. Immediately after a tumor has been surgically removed, doctors implant radioactive seeds right along the wall of the cavity. These technologically advanced seeds are custom prepped with dosages of Cesium 131, a radioactive isotope that, similar to a biodegradable stitch, largely burns itself out within a few weeks (its half-life is roughly 9.7 days), delivering its cell-killing energy more locally than chemo or external beams can. To maximize their impact, these seeds are assembled into what is trademarked as a GammaTile®, …which is no more than a preassembled, mat-like structure that houses the required number of seeds, each spaced to emit a steady flow of localized radiation.
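For readers who want the arithmetic behind that claim, Cesium 131's published half-life is roughly 9.7 days, so the standard exponential-decay law gives the fraction of the total dose delivered by any time t. The figures below are back-of-the-envelope illustrations, not clinical guidance.

```latex
% Radioactive decay of Cesium-131, half-life T_{1/2} \approx 9.7 \text{ days}
N(t) = N_0 \left(\tfrac{1}{2}\right)^{t/T_{1/2}},
\qquad
\text{fraction of dose delivered by time } t = 1 - 2^{-t/T_{1/2}}.
% Example: after 30 days, 1 - 2^{-30/9.7} \approx 0.88, i.e. about 88% of the
% total dose has already been emitted; after 60 days it exceeds 98%.
```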

You might be wondering, as I did, “…why hasn’t anyone written about GammaTile® cancer-curing treatments, if indeed they are so effective?” The pure and unadulterated answer is the cost of the treatment. No, not because it is too expensive, but just the opposite. It is too affordable! A typical brachytherapy GammaTile® treatment runs about 75 to 90% less than chemotherapy treatments and other similar biological therapies.

Big pharma won't support brachytherapy because it would mean selling fewer addictive and expensive drugs to insurance companies. Hospitals won't support it because the number of visits is reduced significantly, and doctors avoid it because insurance companies won't pay for it. The fact is that a viable cure for cancer using GammaTiles® and brachytherapy has failed to enter the mainstream of cancer treatments largely due to special interests, ironically, none of which seem to involve the best interests of the afflicted patient and their family.

Not all hope is lost. The Barrows Cancer Center expects an official insurance code for Cesium 131 to be approved by July 2017, which would at the very least make it easier for doctors to get paid by insurance companies. …a bold step in the right direction!

…the steep price our society pays.
To appreciate the hidden burden our society bears by keeping proven cures for cancer out of the mainstream, one need only ponder the loss of Beau Biden. Just think what might have happened had he survived his battle against brain cancer with a simple GammaTile® solution. His survival could very likely have changed the course of history.

The question remains: how many more Beau Bidens must we lose before the most common cancers can be treated and supported as just another common ailment?

© 2017 Tom Kadala

Betting on the Brits – …my FOREX-related story

A couple of years ago I was asked to cover a trading desk at a prestigious firm in London. What transpired in the first week was quite humorous, since neither party really knew the other all that well.

During one of our daily banters at lunch break, I asked one of their top traders whether it was possible for a random freelance writer, like me, to become an expert Forex trader like him and his colleagues, …in, say, less than two years' time? They chuckled, but soon the question turned into a friendly wager between us. They were willing to teach me the art of their trade if I agreed to stay in London a few months longer. Without any hesitation, I agreed. What I failed to tell them, however (since they never asked), was that aside from freelance writing, I was also a seasoned programmer with over 25 years of experience.

For the next three months, I watched two top traders work the Forex markets masterfully. In the evenings I diligently programmed what I had learned into a versatile algorithm. To keep up with the trading lingo, I pored over technical trading manuals, took copious notes on their many trading styles and strategies, and carefully observed their trading behavior under many different circumstances. After a few days in the trading room, I quickly learned how widely those circumstances could vary.

Each morning we met briefly to review global events and discuss trading opportunities. Every meeting yielded a different outcome. For example, one morning interest rates moved up in Australia, causing investors to dump Euros and buy Australian dollars. The ripple effect triggered a similar outflow in Emerging Markets, whose respective currencies sometimes reacted with greater amplitude. Within seconds, my two mentors were skillfully working the South African Rand and the Turkish Lira, using Gold as a potential hedge. The Euros that were sold earlier in the day were bought back, all at a precisely calculated, risk-managed profit. Later that week, changing oil prices, commodity prices, invasions, trade disputes, earnings, GDP, non-farm payrolls, hedge-funding currencies, political elections, and many other factors, including the prospects for a Grexit and Brexit, weighed in. In an unpredictable manner, each event contributed to an uncanny sense of 'controlled mayhem' in the trading room.

Despite the daily chaos of the markets, these two cool cats maneuvered skillfully through the maze, pinpointing areas of high-probability arbitrage with incredible accuracy. Like the rails on a train track, their unflinching trading discipline was solid and consistent, day in and day out. They reminded me of two sturdy pillars standing firm against a fierce tornado. It was truly amazing to watch and absorb, not just for what it did for their trading results, but more for the many ways it could be applied to daily life decisions.

Perhaps, my greatest takeaway from the entire experience could be summed up as follows:

“Add clear thinking, risk management, and a disciplined approach to any
problem and the odds for a successful outcome can be greatly improved.”

Simple? Yes, but very difficult to achieve on a sustainable basis, at least for humans, though not so for a pair of fast computers and a cleverly written algorithm. What happened next surprised us both. See for yourself at http://www.ragingfx.com.

So, you might ask, who won the wager?

As it turned out, we both did. …because we had unwittingly hedged our respective bets! They taught me everything they knew, while in two years' time I created an amazing algo that digitized everything they had taught me!

© 2016 Tom Kadala

Yes, You can NASA that!

When you need an address, a definition, or information about anything on earth, friends will tell you to 'Google that'; but what if your smart device or robot needed to look something up, considering they too have become part of the 'Internet of Things'? For them, 'Googling that' may help narrow some choices, but the binary-like answers (yes or no) that futuristic devices will need in order to operate may have to come from another source altogether, such as a 'collaborative search engine' driven and inspired by a half-century-old institution: NASA.

What is a ‘collaborative search engine’?

In short, it is a search engine that by today's standards is incomplete; not because a database is missing or links are broken beyond repair, but because its primary source of information does not yet exist and is essentially pending discovery. Let me explain. While Google has become the central source for all known data (good, bad, and even ugly), NASA is emerging with an alternative search engine concept altogether. Instead of 'crawling' the web to organize existing data the way Google's algorithms do, NASA is organizing groups of talented individuals all over the world through virtual 'Challenges' to help it address a daunting list of unsolved problems, whose collective contributions may one day make space travel as much of a business reality as airlines are today. These global efforts will soon be centralized into a massive collection of ideas that will be, in one way or another, associated with NASA's existing space data.

NASA’s Space Apps Challenges
These Space Apps Challenges, as they are called, are huge. Last year's two-day global event, for example, set the Guinness World Record for the largest-ever 'Hackathon'-like gathering, with over 9,000 registered participants representing 484 organizations in 83 cities across 44 countries. At this year's event, the number of attendees worldwide jumped above 10,000 and is expected to rise further as NASA continues to tap outside its walls for novel ideas, clever approaches, and outright brilliant breakthroughs from cadres of scattered, talented, and unlikely groups of individuals.

This year one of the city events was held at AlleyNYC near Times Square, in the heart of New York City, where a packed house of eager space aficionados of all ages, all walks of life, and every professional talent imaginable converged to inspire and be inspired. In a business-like manner, NASA's Deputy CIO and CTO, Deborah Diaz, opened the event by presenting details of the institution's tide-changing decision to post NASA's gargantuan vaults of space data on the web at open.nasa.gov, …where anyone with an internet connection can access its vast contents freely. Experimental data from the International Space Station (ISS), weather data on Neptune, real-time meteorite positioning, GPS-landscape image coordinates on Mars, and much more are there for the taking. NASA hopes its open-data policy will inspire groups to form organically, as they often do at Hackathons, and address many of the institution's pressing current and future challenges in space. On a side note, Diaz expressed her profound view that NASA's open-source efforts could one day change the future of global democracies from one of 'freedom-of-choice' to one of 'freedom-of-thought'.

NASA’s Challenges in Space
To help participants put space challenges into perspective, American test pilot and mission specialist astronaut Doug Wheelock, who has logged 178 days in space, shared his views about space and space travel during a press conference at the event. According to Wheelock, space is a brutally hostile environment that does not compare to anything on earth. To appreciate his perspective, imagine a place where the sun rises and sets 16 times every 24 hours, and every time the sun shines, materials such as the body of the space station or an astronaut's spacesuit are subjected to temperatures exceeding 450 degrees Fahrenheit. When the sun sets, temperatures swing the other way, dropping to 300 degrees Fahrenheit below zero. Radiation levels surrounding the space station are so high that, despite the station's thick walls, some of the 70+ laptops on board used to perform experiments may inadvertently 'fry'. In a humble manner, Wheelock told a packed audience that NASA cannot continue its mission to Mars without the discovery of new materials that can withstand wide, frequent temperature swings and intense radiation exposure over long periods of time.

Medical issues in space are another of NASA's imperatives. Wheelock described issues with atrophy in the leg muscles, blurred vision, depression, and even loss of taste, all due to exposure to zero gravity. Taken for granted on earth, gravity gives our legs purpose, our sight a level horizon for distinguishing moving objects, our potential mood swings a sense of equilibrium, and even our mouths active taste buds. Our brains are wired to calibrate our bodily functions based on gravity. In a zero-gravity environment, for example, our legs become essentially useless. In a defensive move, the brain pushes blood away from the legs and toward the head to recalibrate for the changed gravity. Space station astronauts have learned to counter some of these physical anomalies by exercising their legs regularly with bungee cords, for example, but they look to outside sources for future discoveries and ideas on preventing potential blindness and automating cures for other unexpected and yet-to-be-encountered physical and psychological disorders and ailments.

Challenges in Space Travel
Then there was the question about space travel, a question that just about any individual, young or old, would want to ask an astronaut. What is it really like lifting off from earth in the Space Shuttle, living at the International Space Station for months at a time, and taking a space walk? Here Wheelock did not disappoint.

In a candid and unreserved manner, Wheelock described the distinct noises he would hear while walking underneath the space shuttle prior to a launch. He spoke of the heaving and creaking of the massive rocket's cylindrical tanks, brim-filled with liquid hydrogen. He also pointed to the constant clicking of the many valves used to control fuel flow. The area was, in his own words, 'its own climate', with chunks of ice falling and water dripping off the sides, surrounded by clouds of hydrogen escaping violently with high-pitched hissing sounds. He gazed up at the rocket's main nozzle knowing that at liftoff its center would reach 6,000 degrees, or two-thirds of the temperature of the surface of the sun. Since no metal can withstand such heat, NASA engineers designed the nozzle with a thickness sufficient to keep it from melting completely before its final phase release. In sincere earnest, Wheelock turned to the audience once more and told them that future space travel requires stronger and lighter materials that have yet to be discovered.

After liftoff the rocket rolls to one side to counter aerodynamic forces caused by the shuttle's stubby wings and to face its antennas toward earth. From the ground the roll looks smooth and orderly, but inside, Wheelock admits, 'it's another world'. Once airborne, the rocket rattles 'like mad'. The G-forces he experienced were so great that reaching a switch on the controls overhead required an immense physical effort. The vibrations and gyrations of this metallic beast forging its way against the will of nature cause the vessel to 'rock all over'. Nearing orbit, the vessel relies on its liquid fuel, and upon entering space it settles into Newton's first law of motion, the state in which an object in motion stays in motion. With a deep sigh of relief, the astronauts are finally cleared for space travel. Their vessel floats gingerly onward into the silence of space.

Inside the International Space Station, a new normal for life on board evolves quickly. Food is tasteless. The air in the station smells like the venting area of a power supply unit. The temperature is a comfortable 70 degrees, and the prevailing noise of vents cooling laptops and other electronics hums at a familiar 60 Hz. The sleeping quarters are slightly quieter, while the exercise room tends to capture the smell of human sweat. Missing, in the minds of the astronauts, are the familiar earth scents of dirt and grass.

On the few occasions Wheelock ventured out on space walks, he liked to reference the space drama film Gravity to illustrate his experiences. 'It's pretty accurate', he said. Similar to one of the film's most suspenseful scenes, Wheelock briefly described his own feelings when he had to release the safety cord attaching him to the station to complete an improvised maneuver. For a brief moment, as he pressed the button on a joystick controlling the jet packs on his 300-pound suit (last designed in 1970), Wheelock recalls rotating around to a magnificent view of the earth with the space station out of sight. Images of 2001: A Space Odyssey flashed in my mind as he described his brief encounter with being 'very alone in space'. With no GPS available to remotely control an automatic safe return to the station, Wheelock turned to the audience and again pointed at more areas where NASA needs help with new ideas and discoveries.

Historically NASA has always fed the industry pipeline for technological advancements. The incredible feats of lifting rockets, placing satellites into orbit, and landing humans on the moon have pushed the envelope on technological breakthroughs. The many derivative applications have created new industries, exciting careers, and a notable increase in global economic standards of living.

In an effort to address a proposed landing on Mars in the next 30 years, NASA is once again taking the lead on reinventing the future of data search engines, to be developed by unlikely groups of global talent, fueled by NASA data, and created for machines to interact with other machines. Do not be surprised when, in the not-so-distant future, your friendly robot pauses during one of your voice commands to say, "Yes, I can NASA that."

© 2014 Tom Kadala

Harnessing Big Data with a Systems Thinking Approach – (A Harley Davidson Case Study)

With 90% of the world's data created in the last two years, what can we expect our data vaults to hold two or even twenty years from now? Today we measure our lives in petabytes, but estimates suggest that by 2020 the bits and bytes that define our lives will have grown by 2,300%, to 35 zettabytes. How then can we as a society leverage the intrinsic value of so much data without getting bogged down in its complexity?

Around the turn of the century, we experienced a similar moment of euphoria when retail outlets opened 'virtual stores' and sold products to online buyers. A famous IBM TV ad once depicted an overwhelmed young company whose products went from a few online orders a day to hundreds of thousands. In many respects we have come full circle and are back at the starting gate of yet another era of unprecedented growth, only this time, instead of millions of orders, the focus is on zillions of data points.

In 2000, CEOs focused primarily on IT integration and supply chain strategies to fulfill a surge of orders. Their managers implemented the latest e-commerce packages, leveraged the cloud to reduce costs, broadened and compressed their global supply chains, and trained their workforce to adopt new workflows. Success was determined by a customer's positive experience, measured primarily by the number of accurate and timely deliveries.

Today, the paradigm has shifted from transaction-centric to customer-centric. Companies no longer wait for customers to buy; instead they develop sophisticated algorithms that compare a specific customer's purchase history with multiple data sets, including credit rating reports, recent purchases, and, most extraordinarily, their genuine propensity to buy based upon the web pages they most commonly visit. Surprisingly, web behavioral data has become a powerful complement that can offer unprecedented efficiency benefits to both the merchant and the consumer. Customers receive compelling suggestions, while stores stock the products their customers are most likely to purchase. It's a win-win for both. Issues of privacy remain a sticking point for some individuals, but as the benefits to the consumer improve, even these issues are expected to become less significant.

Striking the optimal balance will be tricky, especially when the journey also involves slogging through mounds of unstructured web data. One approach being talked up within academic circles is systems thinking.

MIT’s SDM Conference – (sdm.mit.edu)
At a recent System Design and Management (SDM) conference at MIT called "A Systems Approach to Big Data: Going Beyond the Numbers", Senior Lecturer J. Bradley Morrison greeted a packed audience with a refresher on System Dynamics, the study of how the various components within a company (people, materials, contracts, etc.) interact and react together to create a product or service. Morrison's 'Back to the Classroom' exercise offered new insights into how the principles of 'systems thinking' that today help companies scale their global operations can also be applied to leverage the new era of big data. His explanation is also testimony to the incredible versatility of 'systems thinking' and system design management principles.

Morrison divided 'Systems Thinking' into several key areas. First was 'Dynamic Complexity', which evaluates reactions when a smooth-running assembly line is inadvertently interrupted; for example, when a supplier's product fails and an alternative source is unavailable. According to Morrison, unexpected manufacturing events can also have a direct effect on a company's morale and effectiveness. The reverse is also true: systems that operate smoothly can greatly improve what Morrison refers to as the 'Mental Model'.

Another key area is 'Stocks and Flows', which Morrison humorously dubbed 'Bathtub Dynamics'. Similar to balancing the water level in a bathtub with running water, systems thinking can help calibrate inflows (e.g., inventory build-up) versus outflows (e.g., sales). The depth of the bathtub is determined by a company's internal competitive advantage. These advantages vary widely, but with regard to the alignment of systems thinking with big data, Morrison focused on skills training as a key differentiator. He highlighted his points with a case study from a US motorcycle manufacturer, Harley Davidson.
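A toy simulation makes the bathtub metaphor concrete: the stock (inventory) at each step is simply the previous level plus inflow minus outflow. The production and sales numbers below are invented purely for illustration.

```python
def simulate_stock(initial_stock: float, inflows: list, outflows: list) -> list:
    """Bathtub dynamics: stock at each step = previous stock + inflow - outflow."""
    stock = initial_stock
    history = [stock]
    for added, drained in zip(inflows, outflows):
        stock = max(stock + added - drained, 0.0)  # inventory cannot go negative
        history.append(stock)
    return history

if __name__ == "__main__":
    # Illustrative numbers only: production (inflow) outpaces sales (outflow),
    # so inventory, the 'water level', creeps upward week after week.
    weekly_production = [100, 100, 100, 100]
    weekly_sales = [90, 95, 85, 80]
    print(simulate_stock(initial_stock=50, inflows=weekly_production, outflows=weekly_sales))
```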

Harley Davidson Case Study
In the late '90s, Harley Davidson implemented lean manufacturing systems throughout its operations. Management leveraged its strong union relations to encourage employee input. The response was overwhelming. After numerous meetings, participating employees elected to improve the rotor area on the shop floor. Soon new signs went up, space allocation was optimized, and the new employee-driven initiative became a reality. Management was pleased with the progress. The improvements paid off with an increase in productivity from 70% to 94% without the need for additional floor space. All in all the project was a success story, until a common syndrome called 'process degradation' set in.

Like an ambitious diet plan, the initiative reached its goal only to become unsustainable thereafter. Unaddressed issues, such as who was responsible for maintaining the new process, eroded the achievements. The collaborative efforts to engage and integrate the surrounding workforce were weak and gave way to a 'do-it-yourself', 'if-and-when-you-can' approach. Despite the obvious benefits, workers returned to their old habits, inhibiting further progress.

Who was to blame? …management, labor, or both?

Improving productivity with limited resources is a common problem for every company. That is why CEOs leverage technology, timely intel, and training whenever and however possible. Of these three, Morrison points to training as the greatest challenge and the most commonly ignored. Even when training is available, the type he recommends is not classroom-style but on-the-job training.

"Learning a new skill is one thing, but learning how to replace one's old habits with a new skill is quite another," Morrison explained. "Workers need the opportunity to 'change their own mental model' before the true benefits from increased productivity can be fully realized."

According to Morrison, managers should give their workers the opportunity to learn a new system on their own terms, even if it requires allocating extra time during a shift or workday — as much as 50% more time. Unless workers are given a chance to appreciate the time-saving benefits on a personal level, they will more than likely return to their old habits and simply 'add on' the new changes rather than adopt them for their intended benefits.

Looking ahead…
In the next few years, new skills training will involve some form of data analytics integration. As data sources swell in every part of a business, relying on a specialized team to manage the company's data needs will become unsustainable, especially when experts tell us that big data and data analytics, done right, depend upon the seamless collaboration and exchange of data from every corner of the company. Visionary CEOs will require every employee to learn how to collect, disseminate, compare, and use data from multiple sources. Soon-to-be 'unsiloed' departments will depend upon each other in an entirely new manner, since the data they collect will determine the value and quality of data for the rest of the company.

Just how CEOs balance this data exchange while injecting behavioral changes among their ranks will be a number one priority for years to come. …and yet, will CEOs have the foresight to allow their employees to experiment with best practices on company time? As we learned from the Harley Davidson case, leaders who allow their employees to adopt new behavioral changes on their own terms will more than likely achieve measurable, sustainable advantages. On the other hand, those who follow the herd by, for example, hiring more data scientists to solve their data issues, may lose an unprecedented opportunity to transform their workforce. At this juncture, CEOs would do better to implement a systems thinking approach today that allows every employee to eventually become a specialized big data provider and user for the company.

© 2014 Tom Kadala

Disrupting the Banking Industry with Big Data and Data Analytics

Bankers seen sipping away the hours over client martini lunches at upscale restaurants and posh clubs are rare these days. The slump in credit demand from the global economic crisis is partly to blame, but so too is the absence of 'live' clients. Branch offices that were once community hangouts on payday look more like empty office space for lease. Today bank clients 'hang out' virtually, doing most of their banking online. They lurk in and out of web-based services, unwittingly leaving behind hundreds of data points (like footprints) that, when reconstructed using data analytics algorithms, can accurately reveal the client's real identity.

At a first-of-its-kind event in Atlanta, Georgia titled Customer Insights & Analytics in Banking Summit 2013, representatives from various forward-thinking banks and data-analytics service companies presented their combined views to a packed room of financial professionals. Organized by Data-Driven Business (datadrivenbiz.com), a US arm of FC Business Intelligence (a London-based events company), the Summit personified the past, present, and future of banking. First, it exposed the ugly truths of a complacent banking culture. Then it highlighted the extraordinary accomplishments of early-adopter banks, and, finally, it unveiled a fantastic prediction of how banking could potentially hold the keys to unlocking the value of social media feeds from Twitter, Facebook, and other similar web-based services.

With off-the-shelf data analytics software tools, bankers can gain an accurate 360-degree view of their customers on an individual basis just by matching a customer's banking data (i.e., loans, credit card purchases, investments) with their behavioral patterns online. The technology used to integrate data sets and match behaviors with individual names has advanced remarkably, so much so that bankers can calculate with reasonable accuracy the 'lifetime value' of each customer. This magical step has been demystified by over 150 vendors who specialize in the science of Digital Data Integration, or DDI. DDI connects numerous disparate data sets, both structured and unstructured, using assigned ID numbers. Expert companies in this area include Aster (asterdata.com, a TeraData Company), Actian (actian.com), PrecisionDemand (precisiondemand.com), Convergence Consulting Group (convergenceconsultinggroup.com), and Actuate (actuate.com, a BIRT company). The principal reason bankers want to segment their customers by future income potential is to allocate their limited resources more efficiently.
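A minimal sketch of the DDI idea, assuming two invented data sets keyed to the same assigned customer ID: one structured banking table and one table of web-derived behavior. The column names and the toy 'lifetime value' weighting are illustrative only, not any vendor's method.

```python
import pandas as pd

# Invented sample data: structured banking records and unstructured-web-derived
# behavior, both keyed to the same assigned customer ID.
banking = pd.DataFrame({
    "customer_id": [101, 102, 103],
    "avg_monthly_deposits": [2500, 800, 4200],
    "credit_card_spend": [1200, 300, 2100],
})
web_behavior = pd.DataFrame({
    "customer_id": [101, 102, 103],
    "mortgage_pages_visited": [14, 0, 3],
    "investment_pages_visited": [2, 1, 9],
})

# Digital Data Integration in miniature: connect disparate data sets on the ID.
profile = banking.merge(web_behavior, on="customer_id")

# A toy 'lifetime value' score; real models weigh far more signals.
profile["lifetime_value_score"] = (
    0.5 * profile["avg_monthly_deposits"]
    + 0.3 * profile["credit_card_spend"]
    + 25 * profile["mortgage_pages_visited"]
    + 15 * profile["investment_pages_visited"]
)
print(profile.sort_values("lifetime_value_score", ascending=False))
```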

Banks that fully integrate their operational data with unstructured social media streams will become the game-changers to watch. Already, Old Florida National Bank boasts a younger, more agile management team (under 43 years of age) who credit their surging asset growth of the past four years (from USD $100m to $1.4b) to their data analytics initiatives. Their team has the bank culture, mindset, and know-how to implement data analytics tools that capture a digitally holistic view of their customers. By mapping where their customers spend most of their time and money, management can target more relevant and timely offerings. Targeted customers unwittingly respond with not only a buying interest but also a willingness to refer the bank to a friend or colleague. …truly a win-win for all.

SunTrust Bank uses data analytics to determine not only the location of its next branch office, but also the optimal management qualifications required to operate one of its branches. Another interesting case study came from Wells Fargo. Their data analytics team integrates thirty-two data sets (from both internal and external sources) and presents the results in a customized dashboard format to managers company-wide. Managers use the service to make better decisions, present data on an ad-hoc basis at meetings, and self-serve their specific research interests using additional data visualization tools for non-techies. The tools they use are off-the-shelf Business Intelligence (BI) software packages provided by companies such as Oracle (oracle.com/BI), MicroStrategy (microstrategy.com), and Tableau (tableausoftware.com).

Servicing a more digital client base has come with many challenges as well as unexpected opportunities. For example, credit bureaus that traditionally deny 96% of consumer credit requests often reject qualified candidates. Using data analytics tools, however, banks can integrate comparative behavioral data with a candidate's payment history and reassess their risk profile accordingly. The result would be approving more loans that would otherwise have been turned down. Other exciting ways for banks to grow revenues include working with real estate brokers. Banks can determine which of their clients are most likely to purchase a new home and pass the list on to an agent. Agents seeking better leads will more than likely recommend mortgage business back to the bank that shared the intel.

One can just imagine how many more ways bank data can play an integral part in helping companies find their most likely customers and future business. Banks already manage the transactional data in-house and are rapidly gaining the business intelligence experience needed to integrate their customers' behavioral data and compare their profiles with those of their peers. Under this scenario, one might wonder why any business would not want to work with a bank that not only understands their business but also delivers buying customers.

With this much real-time intel available on customers in one central location, could banks one day become the primary lead source for their business clients? Could this new normal become a significant game-changer in the banking industry?

Despite a rosy future, the business world is not waiting for banks to embrace data analytics any time soon. Competitive trends point to a number of threats, including retailers such as WalMart that will be offering banking services directly to their customers at retail outlets.

There is also the emergence of the 'digital wallet', which for the time being focuses on reducing the clutter of credit cards using available smartphone technology. Eventually one company will umbrella all credit card transactions and offer global behavioral tracking intel. Pioneers on the forefront include Protean Payments (getprotean.com), a recent startup that plans to use Bluetooth technology to replace card swiping at terminals, and Wallaby (https://walla.by), a company that helps cardholders maximize points earned prior to making a purchase. There's also eBay's PayPal (paypal.com), which has released a debit card concept that it hopes will entice developers worldwide to promote their data analytics services to SMEs.

In online banking, Simple.com has no physical presence, nor does it charge the customary fees that traditional banks do. In fact, it offers plenty of financial management reports and suggestions at no charge, …all online, of course. How it makes money is best understood when opening an account: a new Simple.com account cannot be opened unless one is willing to accept 'cookies' on their computer, a permission that releases a user's complete web history to a third party. This insistence suggests that Simple.com places greater value on a customer's behavioral online data than on their banking business.

If Simple.com succeeds, could its new business model significantly change the way consumers perceive a bank's value proposition? Will consumers demand additional compensation for allowing access to their behavioral online data, since the data is worth more than the interest paid on deposits?

For now, banks looking at data analytics for the first time and wondering how and when to take the plunge should heed practical advice from experts who spoke at the event. One individual concluded that, for now, those new to data analytics should start with the data they already have and use predictive findings from data analytics tools to start a conversation rather than to formulate targeted recommendations. This advice, and the rapidly evolving changes in both consumer and commercial banking, remind me of the famous Aesop's fable about the race between the tortoise and the hare. This time, however, the winner may be a third, invisible participant called 'Big Foot', representing Big Data and Data Analytics.

© 2013 Tom Kadala

Could PayPal become the Global Reserve for Cash and Data?

As PayPal continues to reinvent itself, expect the mother of all disruptions: a global currency comprised of cash and data. Similar to how voice and data coexist over the same copper wire today, PayPal's next move will co-mingle cash and data over a shared platform. Instead of converting bits to sound bytes, however, PayPal hopes to seamlessly integrate customer and peer data (in the cloud, of course) and deliver customized business intelligence across multiple platforms to small business merchants all over the world — right when they need it most. There is one catch: every merchant transaction, including credit cards, would have to involve PayPal.

Last July at a PayPal-sponsored 'Battle Hackathon' event, which took place at AlleyNYC (alleynyc.com) near Times Square, over 100 local software developers worked through the night in small groups to create a new 'killer app' of their choice for a chance to win a $100,000 grand prize. The event was one of ten stops on PayPal's world tour, which included Barcelona, Berlin, Moscow, Seattle, and Tel Aviv. Throughout the night, PayPal's minions were on hand to help developers integrate a list of special-access APIs (Application Programming Interfaces) into their code. These APIs offer developers controlled access to PayPal's databases. Aside from identifying worthy programmers for hire, PayPal uses these Hackathons for feedback on its growing library of APIs. While attending, I caught up with John Lunn, PayPal's Global Director for its developer network.

A former marine biologist who compares PayPal’s membership behavior to schools of fish, Lunn shared some eye-popping statistics from PayPal’s extensive databank.

  • 65% of items purchased in a retail store have been researched prior on the Internet.
  • 43% of browsers at a retail store actually make a purchase.
  • 37% of shoppers who price compare in the aisles using a smartphone App complete their purchase online later.
  • On average 15 year-olds will remain on a retailer’s web page for less than 6 seconds.

"You have to be where your customers frequent," claimed Lunn, who strongly believes that the future of the web is with mobile devices, especially since near-term market predictions for mobile payments are upwards of $20b. Already a prominent player, PayPal expects to process $7 billion in mobile payments next year, which is 10 times its volume of two years ago.

The increased payment activity has PayPal eyeing the customer-specific, behavioral and buying-preference intel that can be extracted from the transactional data. Rich in detail, this harnessed data could become a game-changer for small retailers.

"Without data, you actually know nothing about the consumer," Lunn exclaimed. Conversely, with data, a merchant can address a customer's wants and needs at a lower cost. Showing customers what they will most likely purchase, based on their personal profile and peer comparisons, can make every aspect of running a business immensely easier and more efficient. From marketing, sales, inventory control, and retail space to employees on the floor — every improvement based on better business intelligence, derived from rigorous data analytics and self-teaching algorithms, will have a lasting impact on the rest of the business as well as its corresponding supply chain.

A customer’s buying experience is important too…  

“Buyers no longer want to wait in line,” Lunn notes. And why should they, if technology can enable them all to step up to the same checkout counter at once? Lunn used Jamba Juice as an example of how PayPal cardholders can order their favorite drinks from a mobile device and use face recognition to verify the purchase in the store. There is no waiting around, since drinks are prepped in time to be picked up. Watching a worker cut up vegetables and blend a customer’s health drink was once perceived as fresh and worth the wait. Not anymore. Consumers value their time as much as they do the products they buy.
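To make the order-ahead idea concrete, here is a rough, hypothetical sketch of the flow Lunn described, with an in-store check-in step where the cashier confirms the arriving customer. Every class, field, and function name here is invented for illustration; this is not PayPal’s actual implementation.

```python
# Hypothetical order-ahead flow; all class and field names are invented.
from dataclasses import dataclass


@dataclass
class Order:
    customer_id: str
    items: list
    status: str = "received"


class StoreQueue:
    """Tracks remote orders so drinks are ready when the customer arrives."""

    def __init__(self):
        self.orders = {}

    def place(self, customer_id, items):
        order = Order(customer_id, items)
        self.orders[customer_id] = order
        order.status = "being prepared"  # kitchen starts immediately
        return order

    def check_in(self, customer_id, photo_matches: bool):
        """Cashier confirms the arriving customer against a stored photo."""
        order = self.orders.get(customer_id)
        if order and photo_matches:
            order.status = "handed over"
        return order


queue = StoreQueue()
queue.place("cust42", ["mango smoothie"])
print(queue.check_in("cust42", photo_matches=True).status)  # handed over
```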

With buyers far more knowledgeable about products than ever before, the only line of defense available to merchants is a deeper understanding of their customers’ buying habits. But knowing what a customer purchases in one store is not enough to make a difference. Merchants need access to richer, timelier intel about their customers and their peers: not just what they bought recently from that merchant, but what they bought elsewhere too, with other merchants, online or offline, locally and globally. With access to this much data, merchants could target their best customers and provide them with exceptional service, especially during the few minutes a customer spends at the checkout counter. For example, once a face is recognized at the register or an account number is entered, hundreds of data points could be co-mingled, correlated, and calculated instantly between PayPal’s and the merchant’s databases to produce a customized product recommendation, such as a special offer or a custom-printed coupon booklet. Each timely recommendation would help build a stronger bond with the store’s brand.
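As a thought experiment, the checkout-time recommendation might boil down to something like the following sketch: the merchant’s own purchase history joined with an assumed feed of peer purchase data to pick an offer the customer has not tried yet. The data sources and field names are invented for illustration.

```python
# Hypothetical sketch: the data sources and field names are invented for
# illustration; PayPal exposes no such feed in exactly this form.
from collections import Counter


def recommend_offer(customer_id, merchant_history, peer_purchases, catalog):
    """Pick a catalog item the customer's peers buy but the customer has not."""
    already_bought = set(merchant_history.get(customer_id, []))
    peer_counts = Counter(
        item
        for buyer, items in peer_purchases.items()
        if buyer != customer_id
        for item in items
    )
    # Rank peer favorites the customer has not purchased yet.
    for item, _ in peer_counts.most_common():
        if item not in already_bought and item in catalog:
            return {"customer": customer_id, "offer": item, "price": catalog[item]}
    return None


# Example usage with toy data
merchant_history = {"cust42": ["green smoothie"]}
peer_purchases = {"cust7": ["kale shot", "green smoothie"], "cust9": ["kale shot"]}
catalog = {"kale shot": 3.50, "green smoothie": 5.25}
print(recommend_offer("cust42", merchant_history, peer_purchases, catalog))
# -> {'customer': 'cust42', 'offer': 'kale shot', 'price': 3.5}
```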

Integrating into a merchant’s database or CRM system requires an army of developers. PayPal knows this and hopes that its easy-to-use APIs will encourage developers to include PayPal in their clients’ transaction-processing needs. PayPal’s inclusion would do away with the ‘clunky terminals and expensive equipment’ many merchants use today to process credit card payments. However, to make PayPal’s ambitious business-intel plan really work, every merchant on the planet would have to become a PayPal member.

Could PayPal become the Global Reserve for cash and data?

To appreciate PayPal’s shrewd and brilliant strategy, pick up a copy of a fascinating book titled “The PayPal Wars” by Eric M. Jackson. The author explains in compelling narrative detail how the simple idea of helping world economies through job creation, prosperity, and world peace hinges upon merchants trading freely and seamlessly across borders. If merchants in the Congo, for example, could sell their goods as easily to a buyer in New Zealand as to a local one, the improved cash flow would help strengthen their local economies and grow their businesses.

PayPal’s past success was predicated on the individual support of its own members. When eBay tried to replace PayPal with an in-house solution called Billpoint, PayPal’s members rebelled, and after many other similar competitive encounters, members could indirectly claim a personal stake in PayPal’s ongoing success. Their formidable presence overwhelmed even the craftiest challengers. Time will tell whether PayPal’s loyal customers will once again help it forge on with its ambitious quest to become the Global Reserve for cash and data.

© 2013 Tom Kadala

To Byte or Not to Bite: The Myths, Realities, and Trends behind the Science of Big Data Analytics

Without data, a company would never survive in today’s global environment. With some data, it might have a fighting chance, depending upon the quality and timing of the information. But what happens when a company has access to too much data, sometimes referred to as ‘Big Data’? Ironically, it too could go out of business, even with the best technology and staff to manage it. Why? Partly because the data’s ultimate value depends upon who interprets and communicates the recommendations to the rest of the company, a task often left to an internal employee, or ‘Data Scientist’, who may be no more than a recent university graduate armed with theories and little industry practice.

According to Dr. Jesse Harriott, the Chief Analytics Officer at Constant Contact and author of “Win with Advanced Business Analytics”, “setting up a data analytics initiative within a corporation is not a trivial endeavor.” It requires a lot of sponsorship at the corporate level and can take a year or two before achieving a meaningful balance between the influx of web data and its collective value to the company. Harriott shared his wisdom at a recent conference in Boston titled “The Science of Marketing: Using Data & Analytics for Winning.” This powerhouse event, organized by MITX (mitx.org), a Boston-based non-profit trade association for the digital marketing and Internet business industry, served up an impressive lineup of expert panelists who shared their best practices and experiences.

Among them was a star performer, care.com, the largest online directory connecting those in need of care with care providers. Their co-founder and Chief Technology Officer, Dave Krupinski, discussed how the company uses analytics to drive all aspects of its marketing function, including attribution analysis, customer segmentation, user experience, and predictive analysis. As Krupinski explained to a packed room of 300+ professionals, “most CEOs blindly jump into ‘big data’ analytics expecting immediate returns, only to discover (and after great expense) the many intricacies required to get it right.”

Is ‘big data’ analytics really worth the trouble?

If economic times were healthier, then maybe not; but with a slowing economy, companies are forced either to come up with the next differentiating product or service that will give them an edge over the competition or to figure out better ways to surgically target likely buyers based on real-time data. Increasingly fickle consumers, whose loyalties remain largely unpredictable, have made the task exceptionally challenging. And yet no one can blame consumers for their lack of brand loyalty when, on average, they are bombarded with over 500 ad messages per day.

A Typical Corporate Scenario
In a mocked-up example for discussion purposes, a typical CEO hires a ‘Data Scientist’ or promotes someone from IT after reading positive reports from companies that have boosted their sales using ‘big data’ analytics. Once budgets are allocated and a team is in place, software with funny names such as Hadoop, MapReduce, and HAWQ appears. These packages digest massive data sets (mostly unstructured data from the web) and respond quickly to complex SQL queries entered by a team operator or analyst. The output is then parsed into a more visually friendly format, perhaps using expensive Business Intelligence (BI) software, and, when ready, shared at weekly management meetings. In this example, the meeting adjourns abruptly: management felt that the results from the Big Data Analytics team were not aligned with corporate priorities, a common problem that places part of the blame on the Data Scientist’s poor understanding of management’s business needs and part on the CEO for not creating comprehensive, formal data governance.
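For readers wondering what those oddly named packages actually do, the core idea is the map/reduce pattern: transform each raw record into a key-value pair, then aggregate by key, with the work spread across a cluster. Below is a minimal single-machine sketch in Python using invented records; real Hadoop jobs run these same two steps over terabytes.

```python
# Minimal, single-machine sketch of the map/reduce pattern that Hadoop-style
# tools apply across clusters. Records and segment labels are invented.
from collections import defaultdict

records = [
    {"customer": "a1", "segment": "new", "spend": 25.0},
    {"customer": "b2", "segment": "returning", "spend": 80.0},
    {"customer": "c3", "segment": "returning", "spend": 40.0},
]


def map_phase(record):
    """Emit a (key, value) pair for each raw record."""
    return (record["segment"], record["spend"])


def reduce_phase(pairs):
    """Aggregate all values that share a key."""
    totals = defaultdict(float)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)


print(reduce_phase(map_phase(r) for r in records))
# -> {'new': 25.0, 'returning': 120.0}
```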

Disappointed CEOs tend to view ‘big data’ analytics as a think-tank-style department that delivers flawless dictates to the rest of the company, when in fact it should be a collaborative, data-sharing effort among all departments. The secret to getting ‘big data’ analytics to work is less about massaging structured and unstructured data quickly behind closed doors and more about the timely reintegration of field data from every department to continually tweak predictions and outcomes.

What should a CEO do to encourage data sharing among departments?

Most department heads do not share their data with their counterparts, either by choice or due to incompatibility issues. To address this reluctance, a CEO should first explore a standardized database structure and data-exchange format that would allow departments to share their data seamlessly. Next, he or she should develop an incentive plan that encourages staff members not only to share their data but also to request data from others. The fewer restrictions imposed on inter-departmental data exchanges, the more likely new ideas will blossom. Moreover, the positive behavioral changes in the workforce will help the data analytics team stay focused on corporate priorities. Keeping internal operations lubricated with both internal and external data analytics will boost a company’s revenues by default. This approach can lead to a passive revenue strategy that relies more on balancing an operation guided by ‘big data’ analytics than on traditional consulting advice or CEO hunches.
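One way to picture the standardized data-exchange idea is a small, shared record schema that every department validates against before publishing data to the others. The required fields below are assumptions chosen for illustration, not an established standard.

```python
# Illustrative sketch of a shared, inter-departmental record format.
# Field names and required keys are assumptions, not a known standard.
REQUIRED_FIELDS = {"record_id", "department", "customer_id", "timestamp"}


def validate_record(record: dict) -> list:
    """Return a list of problems; an empty list means the record conforms."""
    problems = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        problems.append(f"missing fields: {sorted(missing)}")
    if "timestamp" in record and not str(record["timestamp"]).endswith("Z"):
        problems.append("timestamp should be UTC (ISO 8601, trailing 'Z')")
    return problems


record = {
    "record_id": "r-001",
    "department": "marketing",
    "customer_id": "cust42",
    "timestamp": "2013-06-01T12:00:00Z",
}
print(validate_record(record))  # -> []
```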

A Five-Stage Journey
I turned to Tom Davenport, a visiting professor at Harvard Business School, to categorize the ‘big data’ analytics journey a CEO can expect to take. Davenport listed five progressive stages on the way to ‘big data’ competence in today’s business environment. First come the ‘Analytically Impaired’ companies: they have some customer data but lack a centralized strategy to leverage it. Next up is ‘Localized Analytics’: these entities outsource their data needs to firms that follow traditional marketing practices. Then come the ‘Analytical Aspiration’ types, which centralize their data sources, enjoy C-level support, and operate an in-house data analytics team; at this level companies are just beginning to grapple with their ‘big data’ analytics issues. The fourth stage is reserved for ‘Analytical Companies’, which are showing some success in using data to drive their business. Finally, at the top of the heap, are the ‘Analytical Competitors’. These companies have fully integrated proven algorithms that combine unstructured web data with reintegrated field data to seamlessly predict a specific customer’s expected wants and desires, based on that customer’s past history with the company and elsewhere, along with the same for their closest peer group.

Most daunting to any CEO is the notion that companies ranked at Davenport’s ‘Analytical Competitor’ level can rely almost entirely on their algorithms to run the business. The indisputable outcomes dictate the level of ad spend per quarter, the allocation of ads across multiple platforms, inventory levels per SKU, the quality of maintenance support, headcount, and much more. At some point one might even ask what the role of management should be at a company ranked ‘Analytical Competitor’, and what talent and expertise will be needed to be an effective CEO in this soon-to-be new normal.

© 2013 Tom Kadala