Rutgers Big Data Certificate Program

Data analytics is great for businesses, but it’s also great for saving lives

4/27/2017

Big data analytics is the proverbial catnip for businesses looking for a competitive edge in the market, and it is easy to see why. More granular analysis of large data sets is helping businesses distinguish themselves from their peers by surfacing actionable insights faster than ever before. Businesses can then act upon these insights to improve their efficiency, customer service, and profit margins.
But it is not just about helping to grow profit margins and increase market share. Analytics can be applied to any data set, no matter how large, and you can discover insights to help improve many data-rich applications from smart cities to healthcare. One could argue that it is in the latter where data analytics is providing the ultimate benefits, by helping to save lives.
PATH is a non-profit organisation that uses innovations in technology to save lives and improve the health of those in need, especially young women and children. It tackles a wide range of health issues through entrepreneurial cross-sector partnerships that help develop powerful tools and strategies that can make a difference on a massive scale.


With nearly half of the world’s population at risk of malaria, organisations worldwide are continuously looking for new ways to combat this global health issue. Over the past 15 years, PATH’s various programmes have contributed to saving 6.2 million lives, but despite this, malaria is still endemic in Africa; the World Health Organisation reported approximately 212 million malaria cases in 2015, with young children and pregnant women particularly vulnerable to the disease. Malaria takes the life of a child in Africa every two minutes.
Thankfully, a new PATH project is making great progress to defeat the disease by putting data centre-stage. “Visualise No Malaria” is working with the Zambian government to harness big data analytics and the cloud with the aim of eradicating the disease from Zambia by 2020.
The technology has transformed the efficiency and response times of Zambia’s National Malaria Elimination Center. Coordinators once relied upon reports from health centres and health visitors on bicycles to target the deployment of its limited resources: insecticide-treated bed nets, indoor residual spray, rapid malaria tests and drugs. Now, the project led by PATH and data visualisation software provider Tableau engages a number of different technology providers including EXASOL to quickly analyse data and produce maps that track how the disease spreads. This means coordinators can allocate resources across the country accordingly and prevent outbreaks.


The data visualisation works by mapping geospatial data, combining terrain features such as elevation and slope with hydrological features such as topographic wetness and stream power. PATH’s scientists use these layers to create a precise, accurate map of water courses, and therefore of where mosquitoes are likely to breed. They are working to combine this with meteorological models of precipitation and temperature, and with data about disease outbreaks, to better hone future analysis and allow health officials in Zambia to send resources to areas with the highest probability of a malaria outbreak.
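For readers curious how terrain data becomes a breeding-site map, here is a minimal sketch of one such derived layer, the topographic wetness index, computed from an elevation grid. The elevation values, grid size and contributing areas are invented for illustration and are not PATH’s actual data.

```python
import numpy as np

# Minimal sketch: derive slope and a topographic wetness index (TWI)
# from a digital elevation model. The tiny DEM below is illustrative,
# not PATH's actual Zambian elevation data.
dem = np.array([
    [120.0, 118.0, 115.0],
    [119.0, 116.0, 112.0],
    [117.0, 113.0, 108.0],
])
cell_size = 30.0  # metres per grid cell (assumed)

# Slope from finite differences of elevation
dz_dy, dz_dx = np.gradient(dem, cell_size)
slope = np.arctan(np.hypot(dz_dx, dz_dy))

# Upslope contributing area would normally come from a flow-accumulation
# routine; here it is a placeholder constant per cell.
contributing_area = np.full(dem.shape, 900.0)  # m^2, hypothetical

# TWI = ln(a / tan(slope)); the epsilon avoids division by zero on flat cells
twi = np.log(contributing_area / (np.tan(slope) + 1e-6))
print(np.round(twi, 2))  # wetter (higher TWI) cells suggest standing water
```

Higher-TWI cells are the kind of places where water, and therefore mosquito breeding, is more likely.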
We are delighted that our partner Tableau has involved EXASOL in this project, in which our role is to provide the computational power of our fast analytic database through Amazon Web Services. Through PATH, health professionals are able to perform highly complex queries of not just “big data” but “massive data”, at speeds that enable near-instant rendering of maps and dashboards for interactive analysis. This allows the data to be understood and actioned immediately in the field.
This PATH-Tableau project is the first time that cloud-powered big data analytics has been combined with the know-how to tackle malaria in this innovative way, and if “Visualise No Malaria” proves successful in eradicating the disease from Zambia within the next three years, it will provide a blueprint for other countries in sub-Saharan Africa.
It’s tantalising to consider the possibility that other fast-spreading illnesses could be tackled by looking at how data has been used to help combat malaria. This means that, should another health epidemic similar to the bird flu and Ebola outbreaks of recent years occur, we would be able to analyse those datasets and use visualisations to predict how it will spread and how fast, and then draw up plans to target treatment and contain the outbreak.
We have seen real time analytics power huge advances in many industries, from optimising online gaming to keeping the rails stocked in fashion shops, but when it delivers actionable insights in time to prevent loss of life, it’s a humbling reminder that data analytics can serve a bigger cause.

​Source: http://www.cbronline.com/news/big-data/analytics/data-analytics-great-businesses-also-great-saving-lives/


Modern monitoring is a big data problem

4/26/2017


 
The rate of change and degree of diversity in the IT stack demands fine-grained and frequent monitoring, which turns monitoring into a real-time big data problem
Why did VMware acquire Wavefront? The start of the answer to this question comes with an understanding of what Wavefront is (or was). Wavefront was started by former Google engineers who set out to build a monitoring system for the commercial market that had the same features and benefits as the monitoring system that Google had built for itself.
Due to the massive scale of Google, such a system would have to have two key attributes:
  1. The ability to consume and process massive amounts of data very quickly. In fact, the Wavefront website makes the claim, "Enterprise-grade cloud monitoring and analytics at over 1 million data points per second."
  2. The ability to quickly find what you want in this massive ocean of data
So, it is clear that the folks at Wavefront viewed modern monitoring to be a big data problem, and it is clear that some people at VMware were willing to pay a fair amount of money for a monitoring system that took a real-time and highly scalable approach to monitoring.
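To make that scale concrete, a monitoring data point is typically little more than a metric name, a value, a timestamp and a few tags. The sketch below uses illustrative field names, not Wavefront's actual wire format, to show such a point and a naive ingest-rate estimate.

```python
import time
from dataclasses import dataclass, field

# Minimal sketch of a time-series data point as most monitoring systems
# model it. Field names are illustrative, not Wavefront's wire format.
@dataclass
class MetricPoint:
    metric: str
    value: float
    timestamp: float = field(default_factory=time.time)
    tags: dict = field(default_factory=dict)

def points_per_second(points: list) -> float:
    """Rough ingest rate over the time span covered by a batch of points."""
    if len(points) < 2:
        return float(len(points))
    span = max(p.timestamp for p in points) - min(p.timestamp for p in points)
    return len(points) / span if span > 0 else float(len(points))

batch = [
    MetricPoint("host.cpu.utilization", 0.71, timestamp=1493200000.0, tags={"host": "web-01"}),
    MetricPoint("host.cpu.utilization", 0.64, timestamp=1493200001.0, tags={"host": "web-02"}),
]
print(points_per_second(batch))  # a system like the one described would sustain ~1M/sec
```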


Why is modern monitoring a big data problem?
Rather than just assume that VMware and Wavefront are right about the idea that modern monitoring is a big data problem, let's look at the underlying trends in the IT industry to determine what is changing monitoring in this manner. It is not enough that Google (and by extension other public clouds like Amazon Web Services and Microsoft Azure) have a big data monitoring problem. The question at hand is whether or not monitoring for the typical enterprise has become a big data problem.
The key insight is that the IT environment today could not be more different than it was as recently as five or 10 years ago. Ten years ago, there was one language (Java), and an application ran on one operating system, which ran on one physical server. Applications were updated as infrequently as possible, and changes in general were made as infrequently as possible.
 The image below depicts the typical "IT stack" at an enterprise today.
What is so different about the modern IT environment is the following:
  1. In response to an unlimited demand for business functionality implemented in software, agile development and DevOps were invented to speed code to market.
  2. In response to the same pressures, new languages and run times were created.
  3. The above two changes lead to a very diverse application stack, with frequently arriving and changing applications.
  4. At the infrastructure layer, everything (compute, networking and storage) is virtualized and is often subject to automated management.
In summary, the modern IT stack now consists of very diverse application stacks, with a rapid rate of change (many changes per hour) running on an abstracted and dynamic infrastructure.
How does monitoring have to respond?
If you look at how monitoring is done today, it really has not changed in response to the changes in the IT stack listed above. Monitoring today consists of many different vendors, each collecting a slice of the total data, analyzing it, alerting on it, displaying it in dashboards and presenting it in reports.
Now, it might be tempting to try to find one vendor that can monitor that entire stack for you. Before you go down that road, remember what happened the last time that was tried. It was called Business Service Management, with offerings from BMC, CA, HP and IBM, and it failed miserably (20 years ago) because even then the pace of innovation was so high that each vendor could not keep up. So they acquired companies to fill the gaps in their product lines and were never able to integrate them, which resulted in a mess, and in turn the failure of the BSM suites.
So, the first very important realization is that due to the accelerating pace of innovation in the industry, monitoring must remain a multi-vendor problem. This is so because various parts of the monitoring problem are "whole company" problems that require a significant investment of capital in intellectual property to solve.
Monitoring must also generally change to embrace the following principles:
  1. If the stack is diverse (especially at the application layer), then each component and layer of the stack needs to be monitored. 
  2. Transactions need to be monitored from their inception to their end in the application system (browser to database and back again).
  3. The entire stack needs to be monitored from the top of the stack to the infrastructure (browser to hard disk or storage device and back again).
  4. So, the number of things that need to be monitored increases dramatically.
  5. If the environment is dynamic due to frequent changes at the application layer and automation at the infrastructure layer, then monitoring needs to be much more frequent. Every five minutes is no longer frequent enough. Every minute is no longer frequent enough. 
The big data approach to monitoring
If we accept that monitoring is a multi-vendor problem due to the diversity of the stack, and we accept that the granularity and frequency of monitoring must increase due to its dynamic nature, then monitoring is a real-time, multi-vendor big data problem.
There are then two approaches to implementing real-time big data monitoring:
  1. Have every vendor integrate with every other vendor and try to maintain a nightmare of a compatibility matrix.
  2. Have every vendor integrate with a common high-performance, big data back end especially built for the real-time multi-vendor monitoring problem.
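A minimal sketch of what the second approach implies: every vendor's agent maps its data onto one shared schema and a single ingest call, instead of maintaining pairwise integrations. The class and method names here are hypothetical, not any particular product's API.

```python
from collections import defaultdict

# Hypothetical common back end: every monitoring vendor normalizes its
# data to one schema and calls the same ingest() method, rather than
# integrating pairwise with every other vendor.
class MetricBackend:
    def __init__(self):
        self.series = defaultdict(list)  # (metric, source) -> [(timestamp, value)]

    def ingest(self, source: str, metric: str, timestamp: float, value: float):
        self.series[(metric, source)].append((timestamp, value))

    def query(self, metric: str):
        """Return all points for a metric across every vendor/source."""
        return {src: pts for (m, src), pts in self.series.items() if m == metric}

backend = MetricBackend()
backend.ingest("apm-vendor", "checkout.latency_ms", 1000.0, 182.0)
backend.ingest("infra-vendor", "checkout.latency_ms", 1000.0, 175.0)
print(backend.query("checkout.latency_ms"))
```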
Summary
The diversity of the modern application stack, and the rate of change at both the application and infrastructure layers, requires that monitoring become more granular and more frequent across the multiple vendors required to cover the IT estate. This turns monitoring into a multi-vendor big data problem.

​Source: http://www.networkworld.com/article/3191479/software/modern-monitoring-is-a-big-data-problem.html


How Big Data Will Help Your Small Business

4/25/2017


 
One of those business terms that may sound like yet another meaningless buzzword is “big data”. While big data may appear to be a nebulous term, it can actually have very concrete implications for a business that can help it become more competitive for the long term.
Big data refers to the collection of massive volumes of digital information that can later be analyzed by powerful software. It can be used to improve business processes in nearly every department within a company. According to a report by Information Week, spending on big data by businesses will soon reach $187 billion a year.
With that in mind, it may seem like big data is something that can only be taken advantage of by the largest multinational corporations. That is simply not the case. Below are some of the ways big data can also help small businesses.
Improved Operational Efficiency
One of the biggest benefits of big data for small businesses is the ability to improve the efficiency of many different business processes. If your business uses a factory to create products, for example, sensors can be set up on the production line to collect data on production workflow, which can then be sent via Wi-Fi connections to a central hub or even to a third-party cloud hosting supplier.
Later, big data analytics software can access this information and analyze it to give you an overview of the efficiency, or lack thereof, in your factory. You should be able to use this analysis as a source of intelligence on how to alter the production processes to lower costs and shorten turnaround time. According to a report from the United Kingdom, big data analytics will lead to an increase in efficiency in the manufacturing sector of about 14.7 billion pounds.


Improved Customer Service
Big data can also provide significant improvements in customer service that can greatly benefit small businesses. In the old model of customer service, a customer calls in with a complaint regarding a product or service, and the customer service representative has to try to solve that person’s issue from scratch with no additional information to draw upon. That is not the case with big data. Thanks to big data platforms, every single interaction a customer has with a company can be preserved indefinitely. This information can then be pulled up instantly when a specific customer calls in.
This allows the service representative to provide tailored service based on that customer’s history. He or she will know exactly what kinds of issues the customer has dealt with in the past, and with immediate access to this information, much better and more effective service can be supplied. The platform may even be able to predict the likelihood of certain issues from the data alone, before the customer even calls in.
More Efficient Inventory Control
Big data can also make a big difference for a small business’s inventory control procedures and warehouse protocols. If your goal is to create an automated warehouse, big data can certainly help make that a reality. With big data analytics software, shortages in stock can be predicted days, weeks or even months before they occur. Alternatively, the same data will also prevent overstock from being ordered. Instead, a sweet spot will be achieved in which just the right amount of stock is in the warehouse at all times.
Without the ability to collect and analyze as much data, the task of inventory control isn’t nearly as efficient. That can be a big deal in regards to profitability. Ordering overstock or running out of stock costs companies as much as $1.1 trillion a year.
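As a concrete illustration of that “sweet spot”, here is a minimal reorder-point calculation driven by historical demand data; the demand history, lead time and service level below are invented for illustration.

```python
import statistics

# Minimal reorder-point sketch: use historical daily demand to decide
# when to reorder so stock neither runs out nor piles up.
# The demand history, lead time and service factor are illustrative.
daily_demand = [42, 38, 51, 47, 40, 55, 43, 39, 48, 44]  # units sold per day
lead_time_days = 5      # days between placing and receiving an order
service_factor = 1.65   # ~95% service level (z-score), an assumption

avg_demand = statistics.mean(daily_demand)
demand_std = statistics.stdev(daily_demand)

# Safety stock buffers demand variability over the lead time
safety_stock = service_factor * demand_std * lead_time_days ** 0.5
reorder_point = avg_demand * lead_time_days + safety_stock

print(f"Reorder when stock falls below {reorder_point:.0f} units")
```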
Overall, big data can help a small business achieve greater profitability through increased efficiency. In competitive markets, you need every advantage you can get. Your own data is a valuable resource you shouldn’t let go to waste.

Source: https://datafloq.com/read/how-big-data-will-help-your-small-business/2956


Opinion: Using big data to improve customer relations

4/24/2017


 
It is common knowledge among companies today that harnessing the power of big data is essential for building a successful business model. The rapid growth of digital communities and automation in the workplace has led to major changes in the way that companies communicate with consumers.
From understanding market trends and fluctuations to segmenting consumer data to create more personalized marketing campaigns, big data has resulted in more sophisticated customer relations and has changed the way businesses interact with users on a day-to-day basis.
Build trust by staying competitive
When launching a new product or building up your business plans, companies need to be prepared for the inevitable ups and downs along the way if they are going to stay competitive. Keeping up with the competition doesn’t only mean that you are considered to be a more popular or successful brand; it also shows your customers that you are trustworthy:
● Data enables companies to find the optimal price point that is both desirable for consumers and competitive, without going overboard and pushing potential consumers to look elsewhere.
● Utilizing big data’s expansive reach enables companies to analyze past trends in the market, such as seasonal fluctuations and adjust their finances accordingly.
● Understanding market trends enables you to recognize what your customers need, so you can expand your product offerings, providing package deals or promotions based on those needs.
Thanks to more advanced data analysis and the subsequent rise of professionals specializing in market research, e-marketers and sales teams are now better equipped to handle fluctuations in the market, which has, in turn, created a more ambitious business landscape.
Encourage customer loyalty through advanced marketing
Big data’s influence on the way that companies market products today has been revolutionary. Not only can e-tailers curate individual marketing campaigns by segmenting target markets by gender, age and location, they can also provide tailor-made recommendations based on these variables.
There are various ways that businesses can make meaningful suggestions, which encourage customer loyalty based on predictive modeling:
● Making the most of cluster analysis, which extracts groups of customers with similar attributes, companies can create specific campaigns that can be marketed to each group depending on its location, age group, and gender (see the sketch after this list).
● Data exploration enables businesses to look at consumer behavior through the lens of past histories and statistics, thus translating this information into more personalized campaigns that could attract new customers.
● Implementing automated customer services, such as chat bots can help businesses stay up to date on customer interactions and browsing history, which can then be transformed into customized messages and services.
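As referenced above, here is a minimal sketch of what cluster analysis can look like in practice: grouping invented customer records with scikit-learn’s k-means so each segment can receive its own campaign.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Minimal cluster-analysis sketch: group customers with similar attributes
# (age, annual spend, orders per year) so each group can be targeted with
# its own campaign. The records below are invented for illustration.
customers = np.array([
    [22, 310.0, 4],
    [25, 280.0, 3],
    [41, 1250.0, 12],
    [44, 1400.0, 14],
    [63, 540.0, 6],
    [60, 600.0, 5],
])

# Scale features so age, spend and order counts carry comparable weight
scaled = StandardScaler().fit_transform(customers)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scaled)

for cluster_id in sorted(set(labels)):
    members = customers[labels == cluster_id]
    print(f"Segment {cluster_id}: avg age {members[:, 0].mean():.0f}, "
          f"avg annual spend {members[:, 1].mean():.0f}")
```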
Whether you are a small company or a large startup, making the most of the available data can help your business create more purposeful marketing campaigns that boost customer loyalty at any stage of development.
Boost brand awareness
Today brand awareness is much more than just executing an eye-catching logo; it’s about telling a memorable story. Data enables companies to see what types of customer interactions reinforce positive communication. Advancements in technology, such as automated customer services and machine learning, have led to more accurate and detailed market research, which has resulted in businesses increasing their customer loyalty.
Some of the ways that brand awareness can help to attract customers are:
● Consumer research can help companies to develop their brand to fit customer needs, increasing impact and generating awareness.
● Detailed market research enables companies to monitor consumer activity which can be used to create customized offers that make memorable impressions on potential customers.
● A strong brand identity can result in more customer conversions, as it demonstrates confidence and professionalism, which engenders trust.
There’s no doubt that using data to construct your marketing campaigns and sales pitches is leading to more advanced techniques that increase revenue and customer loyalty.
Overall, utilizing big data complements our human ability to connect and build meaningful relationships. Companies today can utilize advanced data research to relate to their consumers, improving their services and long-term growth. 

​Source: https://www.information-management.com/opinion/using-big-data-to-improve-customer-relations


Design Sprint: The Link Between Lean Product Management and Design Thinking

4/23/2017


 
Lean product management and design thinking are some of the most innovative approaches to launching a product in the startup world today. In the language of entrepreneurial incubation, design thinking refers to a designer’s approach to creating products, while lean product management tilts heavily towards an engineering-based perspective. The question is not which approach is the most effective; rather, product managers are often faced with the challenge of deciding which strategy to adopt at different stages of the product development life-cycle.
Is it better to begin with the lean methodology of using data generated from potential customers to iterate quickly? Or is it better to begin with design thinking – understanding customer “pain points” first in order to develop innovative solutions? The choice certainly doesn’t seem easy.
Lean Product Management
Lean manufacturing principles were first applied in the early 1970s by Toyota’s automobile plants as a strategy for improving the efficiency of processes by cutting down on waste. The lean principle has since been applied in “non-manufacturing contexts”, particularly with startup entrepreneurs seeking innovative ways to launch a product. The lean strategy is a customer-oriented approach aimed at developing products that are well-suited to customer needs by continually engaging them throughout the product development cycle.
The lean startup model paints the following picture:
You’re out of your office in search of prospective customers to test your idea with a keen eye on the market. You get feedback from them with regards to their preferences for the product and then you use the data gathered to iterate as quickly as possible. You continue this process, gleaning lessons from failed experiments and using minimal resources to launch the product and ship out as quickly as possible.
Design Thinking Strategy
Believed to have been developed in the late 90s, design thinking is associated with processes aimed at developing customer-centered products that evolve from getting to know customers’ needs through a form of “participatory research” and then working to provide innovative solutions to those needs.
It emphasizes spending extensive time with potential customers in their communities in order to discover their “pain points” and become driven by empathy to provide innovative solutions to their identified problems. As such, what you have in design thinking is a value proposition with an emotional attachment, which means you are not only concerned about the product meeting customer needs, but you also want to know how customers will feel using the product. This drive ensures that your customers don’t just get value for their money from your product but are also emotionally satisfied with it.
What are the differences?
With regards to project initiation in lean product management, the business idea is already determined from the onset (hypothesis) before it is tested with target customers to ascertain its validity. As such, it can be changed if it is discovered that it does not suit the needs of customers, or it can be modified to fit in. On the other hand, in design thinking, the idea is not determined from the beginning because the problem to solve is unknown at the start of the project. It is conceived only after research is conducted with the target customers and after feedback is received from them.
The lean model mostly  adopts a quantitative evaluation technique to test hypotheses when conducting research with the target customers. It emphasizes using “actionable metrics” to evaluate the effectiveness of processes and solutions developed by providing useful analysis on key performance indicators (KPIs).
Design thinking on the other hand adopts a qualitative approach to research. It emphasizes participation with potential customers in the study in order to experience the emotional attachment they would have towards the product. As such, various “sophisticated” qualitative approaches such as Personas, 2-Axis Mappings, User Journeys, or Causal Maps are adopted to elicit response from the target customers.
What are the similarities?  
Both approaches are customer-centered, which means both focus on developing innovative products that solve specific problems for customers. They both involve using feedback from customer research to iterate quickly and launch products with a minimal amount of time and financial investment.
This implies that both approaches will help you quickly develop not just innovative products, but products that meet customers’ specific needs. The challenge is how the two strategies should be adopted at various stages of the product lifecycle.
Where do they Intersect?
The lean methodology has an embedded structure that helps you test hypotheses quickly at minimal cost. But the question is – how to generate your hypothesis? It isn’t quite easy to discover the needs of customers if you haven’t identified their “pain points” by being in their shoes. This is where design thinking comes in. It provides you with the framework to do that.
Stefanos Zenios, the Investment Group of Santa Barbara Professor of Entrepreneurship at the Stanford Graduate School of Business, gives a compelling description – “For example, in design thinking you develop a prototype that you use to get feedback — that’s very qualitative — and lean startup makes it more rigorous, so you don’t end up convincing yourself that the feedback is positive feedback.”
Zenios emphasized that getting to know the pain points of customers does not come easily; it involves extensive time spent interacting with them. He cited the example of how DoorDash, an online startup that offers meal delivery services, came to discover its customers’ needs (i.e. customers want their meals delivered to their offices) by interacting with a woman who runs a popular macaroon restaurant.
Design Sprints
Design Sprints, developed by Google Ventures, represent the bridge between lean product management and design thinking. They provide a framework through which an idea can be generated and solutions delivered by the end of a single week.
According to Jake Knapp, author of the book, SPRINT: How to Solve Big Problems and Test New Ideas in Just Five Days, “the sprint is GV’s unique five-day process for answering crucial questions through prototyping and testing ideas with customers. It’s a ‘greatest hits’ of business strategy, innovation, behavioural science, design, and more – packaged into a step-by-step process that any team can use.”
Don’t be left out
Lean product development methodology and design thinking have traditionally created not just disruptive products, but also products loved by customers. With the linking of the two models come design sprints, one of the latest frontiers in the world of entrepreneurial incubation.
Don’t be left out of the revolution. Plycode’s Design Sprint Program allows you to discover, prototype, test and validate your product idea with real customers in just one week.

Source: https://www.plycode.com/startups/design-sprint-lean-product-management-and-design-thinking/?utm_content=bufferb9b9f&utm_medium=social&utm_source=twitter.com&utm_campaign=buffer


Data as a Critical Element in the Discovery and Delivery of Smart Energy

4/23/2017


 
Gartner analysts estimate that by 2020 an astronomical 25 billion devices will be connected to the internet. In this internet of things (IoT), a sizeable portion will be sensors. In the energy and utilities industry, these sensor devices will comprise meters, gauges, and thermostats attached to energy turbines, oil drills, smart home appliances, solar panels, power stations, and windmills—all as part of a smart grid. They will act like incessant, inaudible chattering sentinels spewing electrical impulses as data streams.
In this savannah of raw data streams are invaluable insights. By using machine learning techniques to implement predictive statistical models, data scientists can discover valuable insights and illuminative patterns from these data streams. Utility companies can use these results to deliver efficient energy usage, leading to smart decisions and cost savings.
Without machine learning predictive models and real-time detection of defective devices, you cannot discover trends. Without discerning trends, you cannot deliver smart energy.


The Future is here for Smart Grid
Yet you don’t have to wait until 2020. Today, utility companies are employing Apache Spark and MLlib algorithms to tap into sensor data and capitalize on savings. As well as building predictive models, managing and optimizing assets, and monitoring the general health of the sensors and grid, data scientists at energy companies can predict potential faults and prevent failures in their electrical grids by garnering, extracting, and examining data from various sources, as depicted in the diagram.
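As a rough illustration of that pattern, the sketch below trains a simple fault-prediction model on toy sensor readings with PySpark’s ML library; the columns and rows are invented, not any utility’s real telemetry.

```python
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

# Minimal PySpark sketch: train a model on sensor readings to flag
# equipment likely to fail. The toy rows below are illustrative.
spark = SparkSession.builder.appName("sensor-fault-sketch").getOrCreate()

rows = [
    (72.0, 0.31, 0.0),   # temperature, vibration, failed_within_week
    (75.0, 0.29, 0.0),
    (91.0, 0.80, 1.0),
    (88.0, 0.76, 1.0),
    (70.0, 0.33, 0.0),
    (93.0, 0.85, 1.0),
]
df = spark.createDataFrame(rows, ["temperature", "vibration", "label"])

# Assemble raw readings into a single feature vector per row
features = VectorAssembler(
    inputCols=["temperature", "vibration"], outputCol="features"
).transform(df)

model = LogisticRegression(featuresCol="features", labelCol="label").fit(features)
model.transform(features).select("temperature", "vibration", "prediction").show()
```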
Harnessing the power of machine learning
Thomas W. Dinsmore explains what machine learning (ML) can do for your business. Citing various use cases, he examines how ML algorithms can detect cancer pathology better than trained human reviewers; classify paintings into historical genres quicker than art curators; detect fraudulent credit card transactions faster than humans; and recognize voice and images more rapidly than we can, across vast arrays of images and audio files.
These ML techniques can be extended to other sectors, argues Bruce Harpham in “How data science is changing the energy industry.” Harpham suggests that the innovations and ML techniques Dinsmore refers to above can be borrowed from other sectors and transferred to the energy sector, and he elaborates on how the energy sector is already building smart, energy-saving grids.
For example, Francisco Sanchez, president of Houston Energy Data Science, says that “The energy industry has recently started to adopt the survival analysis concept from the medical field.” Used in medicine, survival analysis is a statistical method for predicting survival rates for patients based on their existing conditions and treatment program.
This model is now being applied to predict field equipment health in the oil and gas industry, Harpham writes, enabling early detection of defective equipment, preventing failures and outages, and improving reliability of energy consumption.
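A minimal sketch of the idea: treat each piece of field equipment like a “patient” and estimate how long it tends to survive before failing. The service hours and failure flags below are invented, and the lifelines library simply stands in for whatever tooling practitioners actually use.

```python
from lifelines import KaplanMeierFitter

# Minimal survival-analysis sketch: estimate how long equipment
# survives before failing. Durations and event flags are invented.
hours_in_service = [1200, 3400, 2900, 4100, 800, 5000, 3600, 2200]
failure_observed = [1, 1, 0, 1, 1, 0, 1, 0]  # 0 = still running (censored)

kmf = KaplanMeierFitter()
kmf.fit(hours_in_service, event_observed=failure_observed)

# Estimated probability that a unit is still healthy at each observed time;
# a sharp drop suggests when preventive maintenance should be scheduled.
print(kmf.survival_function_)
```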
In another case, the machine learning techniques that defeated chess and Go masters are being used to conduct, coordinate and conserve energy for flexibility and demand-side capacity. Michael Bironneau, head of technology development at Open Energi, discusses how these ML techniques are employed to create smart energy grids.
In particular, Bironneau explains that one of the benefits of ML techniques and algorithms comes from “unlocking and utilizing flexibility” during high and low energy demands. Through data analytics from the sensors, such algorithms engender ways to redirect or reschedule power consumption, hence balancing the grid and minimizing waste of energy.
But the key here, Bironneau emphasizes, is that with “sufficient [harnessed] data, a ML model can look at a sequence of actions leading to rescheduling of power consumption and making grid-scale predictions.”
All this indicates that ML techniques, used elsewhere in predictive analytics for medicine, customer churn in the retail sector or defeating strategic games, can be applied toward building smart energy grids. Using sensor data from drills, energy turbines, smart house appliances, and windmills, data science and ML can be used to construct smart energy grids, and through data discovery, it can deliver smart energy.
Two Use Cases of Machine Learning in Smart Energy Grids
The unstoppable rise in connected devices, and the volume of data generated, has engendered a demand for advanced analytics using machine learning techniques to mine data. Whether for predicting, monitoring, or identifying patterns and trends of energy usage, some utility companies have embraced this imperative to capitalize on three aspects of advanced analytics to gain real advantage: Disruptive energy insights; the digital customer; and the energy integrator.
Let’s look at two energy utility companies that incorporate data analytics in their smart grid.
Consider DNV GL, a consulting firm that guides energy, maritime, and oil and gas clients in building safer, smarter, and greener grids. Using Apache Spark MLlib, the firm has developed machine learning models and analytics services to analyse massive amounts of sensor data, combined with weather data, to predict, gauge, and temper energy demands, both for natural gas and electric usage.

Source: http://insidebigdata.com/2017/04/21/data-critical-element-discovery-delivery-smart-energy/


Is Cognitive Technology the End of Marketing As We Know It?

4/22/2017


 
“Will artificial intelligence replace marketers in the near future?”
This is the compelling question posed by Loren McDonald of IBM Watson Marketing during his presentation at the recent Digital Summit conference in Los Angeles. While many marketers might consider this a provocative presentation opener, there are some blunt realities marketers need to consider if they want to remain in the field and be competitive.
Consider these stats:
  • ‘Intelligent agents’ or AI will destroy 6% of all jobs in the US by 2021. Forrester Research
  • AI could threaten up to 47% of jobs in two decades. Eric Berger, Ars Technica
So what do artificial intelligence (AI) and machine learning mean, anyway?
Artificial intelligence is about the development of computer systems that are able to perform tasks that would normally require human intelligence, such as visual identification, speech recognition, decision-making and translating between languages. AI plays a role in many of the systems that you use every day, from Siri on your phone to chatbots on ecommerce sites like Staples or 1-800-Flowers to every Google search.
Machine learning is a subset of AI that allows computers to learn much the same way that people do, only faster and without being explicitly programmed for every task that they can complete.
“In economics, things take longer to happen than you think they will, and then they happen faster than you thought they could.” – Rudi Dornbusch, German economist
With the oncoming ubiquity of AI in our everyday lives, you have to wonder where that trend will intersect with marketing. Loren considered whether marketers will be out of jobs in 10-15 years like Uber & Lyft drivers are expected to be. It’s a reasonable question to consider.
I was able to see a demo of IBM Watson’s Cognitive Technology for marketing at the World of Watson conference and the possibilities were impressive. Outside of considering all the ways AI and machine learning could help with extracting insight from large amounts of data and with ongoing campaign optimization, my big takeaway from the demo was that as with all industries that change, those that adapt will survive and thrive. Those that don’t, won’t.


Things like PPC, social ads and any other kind of online advertising would be ripe for AI. Another immediate and practical example of how AI and machine learning could help marketers is email subject line writing and testing.  Loren suggested that you could use a tool like Phrasee, which can learn from your customer response metrics and then use machine learning to quantify and optimize language for you to use that will best engage your audience.
Think about all the structured, repetitive and rules-based tasks you might do on a regular basis as part of your job as a marketer. They are all open to being completed by an AI service. Not only could they be completed by a computer, but they could be done faster and with fewer errors. That could free marketers up to manage even more programs without additional staff.
Now, if you’re wondering what roles and tasks are at risk, Loren shared this list:
  • Easily repeatable
  • Data-centric
  • Tasks that improve with learning
  • Rules-driven tasks
  • Reporting
  • Customer and segment analysis
  • Campaign automation
  • Media buying
  • Campaign testing
It can certainly cause some tension to think that your job might be replaced by a computer, but Loren suggested that the solution to this impending automation of what many marketers do is to adopt a “center brain” marketing approach.
Loren says that center-brain marketing melds right-brain creativity and left-brain analytical thinking with technology to fuel success in a future driven by machine learning.
Traditionally, marketing has been viewed very much as “right brain” and creative. But left brain analytical marketing has been growing fast and most marketing organizations already include a mix of both.
I know that within our own agency at TopRank Marketing, we’ve been using analytics and various data types to optimize marketing programs for years. After attending IBM’s WoW conference, I’ve been salivating over what one could do with Bluemix access to Watson smarts for content recommendations, influencer analysis and finding the many interesting correlations that would help us provide better recommendations to clients. More on that in the future.
As Loren mentioned in his presentation, the shift towards cognitive will be accelerated as marketing becomes more dominated by left-brain people using machine learning and artificial intelligence for marketing decisions, targeting, creative and conversion optimization. Technologies like IBM’s Watson cognitive marketing tools will help marketers deliver more relevant content and offers at the right time than humans alone ever could.
Ultimately, Loren decided the answer to the question about whether cognitive technology will be the end of marketers and marketing as we know it should be answered in terms of what’s happening with driverless cars and the notion of level 2-4 autonomy.
Level 0 – Human only
Level 1 – Cruise control
Level 2 – Tesla Autopilot
Level 3 – The car makes decisions
Level 4 – Human as back-up
Level 5 – No human involved
You can see that there will be degrees of AI implementation, but it’s not an all or none situation. There will still be human powered marketing assisted with technology along with partially and fully automated marketing programs based on goals.
As pressure to scale competitive marketing programs increases alongside growing competition, it is inevitable that cognitive will become a normal part of marketing. The question is, what are your plans as an organization and as an individual to acquire the knowledge, skills and perspective to stay ahead of the game?
Do you think artificial intelligence and cognitive technology will replace part or all of your job? What are you doing to adapt?
You can connect with Loren McDonald on Twitter: @LorenMcDonald and LinkedIn.
This is the second of two posts from the Digital Summit Los Angeles conference that I’m posting this week. Be sure to take a look at the first one featuring Serena Ehrlich from Business Wire where I summarized her advice on search and social media promotion of news release content.

​Source: http://www.toprankblog.com/2017/04/cognitive-technology-marketing/



Making predictions with Big Data

4/21/2017


 
Technology is playing a ubiquitous role in our daily lives—whether it’s policing a city, speeding up financial transactions or transforming supply chains
At first glance, the letter from the Delhi police commissioner’s desk could have easily been dismissed as another routine laundry list of his department’s “achievements” in the previous year.
A closer look at the letter, written a little over two years ago, would have sprung a pleasant surprise in the context of the city police’s technology prowess.
The Delhi Police, according to the letter, had partnered with the Indian Space Research Organisation to implement CMAPS—Crime Mapping, Analytics and Predictive System—under the “Effective use of Space Technology-based Tools for Internal Security Scheme” initiated by Prime Minister Narendra Modi in 2014.
CMAPS generates crime-reporting queries and has the capacity to identify crime hotspots through an automatic sweep of the Dial 100 database every 1-3 minutes, replacing a Delhi Police crime-mapping tool that involved manual gathering of data every 15 days. It performs trend analysis, compiles crime and criminal profiles and analyses the behaviour of suspected offenders—all with accompanying graphics. CMAPS also has a security module for VIP threat rating, based on the vulnerability of the potential target and the security deployed, and advanced predictive analysis, among other features.
A prototype of the standalone version was installed at the Delhi Police control room in June 2015. The software’s statistical models and algorithms today help the police carry out “predictive policing” to forecast where the next crime is likely to occur, much like in cities such as London, Los Angeles, Kent and Berlin.
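As a rough illustration of the general idea behind hotspot identification (not CMAPS itself), the sketch below density-clusters invented incident coordinates so that areas with many nearby reports stand out.

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Minimal hotspot sketch: density-cluster incident coordinates so that
# areas with many nearby reports stand out. Coordinates are invented.
incidents = np.array([
    [28.6315, 77.2167], [28.6321, 77.2172], [28.6309, 77.2160],  # dense area A
    [28.5672, 77.2100], [28.5668, 77.2094], [28.5680, 77.2105],  # dense area B
    [28.7041, 77.1025],                                          # isolated report
])

# eps is the neighbourhood radius in degrees (roughly 1 km here), an assumption
labels = DBSCAN(eps=0.01, min_samples=3).fit_predict(incidents)

for hotspot in sorted(set(labels) - {-1}):   # -1 marks noise/isolated points
    centre = incidents[labels == hotspot].mean(axis=0)
    print(f"Hotspot {hotspot}: centred near {centre.round(4)}")
```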


That’s just one example of how technology is playing a ubiquitous role in our daily lives—whether it’s policing a city, speeding up financial transactions or transforming supply chains.
Fintech start-up Lendingkart Technologies has developed tools based on big data analytics to help lenders evaluate borrowers’ creditworthiness. Using these tools, its sister company Lendingkart Finance Ltd aims to transform small business lending by providing easy access to credit for small and medium enterprises.
The “technology platform has helped create a highly operational efficiency model that enables swift loan disbursement within 72 hours of loan application. Over 120,000 SMEs (small and medium-sized enterprises) have till date reached out to Lendingkart Finance for their credit needs,” the company said.
Accenture Labs and Akshaya Patra, the world’s largest NGO-run midday meal programme, said on Thursday that they had partnered in a project to “exponentially increase the number of meals served to children in schools in India that are run and aided by the government”.
Using “disruptive technology”, they hope to potentially “improve efficiency by 20%, which could boost the number of meals served by millions”.
Accenture Labs began the project with a “strategic assessment and design thinking, then developed a prototype for improving kitchen operations and outcomes”. An example of Akshaya Patra’s transformation, according to Thursday’s statement, was its move “from manual collection of feedback from children and schools to a more efficient technology-based solution” that involved the use of blockchain (the underlying technology of cryptocurrencies like bitcoin) and sensor-enabled devices to gather feedback digitally, and use artificial intelligence (AI) technologies to “predict the next day’s meal requirements”.
Consider another example. Until even early 2015, the thousands of distributors of consumer goods firm Marico Ltd in Mumbai used to place orders and wait “almost a day” before getting the goods delivered. Now it takes just 10-15 minutes for an order to be delivered, helping them stock fewer goods. In turn, the lower inventory helps them cut down on warehouse space and pare costs, besides reducing the waiting time for trucks. All these distributors have benefited from an analytics-driven Order Management Execution System that the company launched in December 2014.
What exactly is big data analytics?
Big Data and the so-called Internet of Things (IoT) are intimately connected: billions of Internet-connected “things” will, by definition, generate massive amounts of data. By 2020, the world will have generated around 40 zettabytes of data, or 5,127 gigabytes per individual, according to an estimate by research firm International Data Corp. It’s no wonder that in 2006, market researcher Clive Humby declared data to be “the new oil”.
Companies are sharpening their focus on analysing this deluge of data to understand consumer behaviour patterns. A report by software body Nasscom and Blueocean Market Intelligence, a global analytics and insight provider, predicts that the Indian analytics market will cross the $2 billion mark by this fiscal year.
Companies are using Big Data analytics for everything—driving growth, reducing costs, improving operations and recruiting better people.



A major portion of orders of e-commerce firms now come through their analytics-driven systems. These firms record the purchasing behaviour of buyers and customize things for them. Travel firms, on their part, use data analytics to understand their customers—from basic things like their travel patterns, the kind of hotels they like to stay in, who their typical co-travellers are, their experiences—all geared to giving the customer a personalized experience the next time the customer visits the website.
In hospitals, intelligence derived from data helps improve patient care through quicker and more accurate diagnoses, drug dosage recommendations and the prediction of potential side effects. Millions of electronic medical records, public health data and claims records are being analysed.
Predictive healthcare using wearables to check vital medical signs and remote diagnostics could cut patient waiting times, according to a 13 January report by the McKinsey Global Institute. International Business Machines Corp.’s Watson, a cluster of computers that combines artificial intelligence and advanced analytics software and works as a “question answering” system, is being used for a variety of applications, most notably in oncology, the branch of medicine that deals with cancer. Watson for Oncology helps physicians quickly identify key information in a patient’s medical records, sift through tons of data and come up with most optimal medical choices.
Many companies globally and in India, including some start-ups, are using machine-learning tools to infuse intelligence in their business by using predictive models. Popular machine-learning applications include Google’s self-driving car, online recommendations from e-commerce companies such as Amazon and Flipkart, online dating service Tinder and streaming video platform Netflix.
Railigent, Siemens AG’s platform for the predictive maintenance of trains, listens to trains running over its sensors and can detect, from the sound of the wheels, which wheel is broken and when it should be replaced.
Predictive algorithms are used in recruitment too. Aspiring Minds, for instance, uses algorithms powered by machine learning that draw on data to address complex issues—for instance, to accurately gauge the quality of speech in various accents against a neutral accent (also using natural language processing). This helps companies improve recruitment efficiency by over 35% and reduce voice evaluation costs by 55%.
Artificial intelligence, machine-learning-based algorithms and anomaly-detection techniques will need to be used to monitor activity across networks and real-time data streams, consulting firms point out. These technologies will, for instance, let banks in India identify threats as they occur while maintaining low false positive alarm rates even for new types of threats.
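As a rough illustration of that kind of anomaly detection, the sketch below flags transactions whose features look unlike the bulk of traffic; the feature rows and contamination rate are invented for illustration.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Minimal anomaly-detection sketch: flag transactions whose features
# (amount, hour of day, transfers in the last day) look unlike the rest.
# The rows below are invented; the last one is deliberately unusual.
transactions = np.array([
    [1200, 11, 2], [900, 14, 1], [1500, 10, 3], [1100, 16, 2],
    [1300, 9, 2], [980, 13, 1], [250000, 3, 40],
])

model = IsolationForest(contamination=0.1, random_state=0).fit(transactions)
flags = model.predict(transactions)  # -1 marks an anomaly

print(np.where(flags == -1)[0])  # indices of transactions worth a closer look
```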
There are still challenges in bringing about wider technology adoption.
“Our survey showed that only about 4% of companies across industries have the capabilities to use advanced data analytics to deliver tangible business value. While some oil and gas companies have invested in their analytics capabilities, many struggle to get their arms around this powerful new opportunity,” said a March 2014 note by Bain and Co.
“We often find that senior executives understand the concepts around Big Data and advanced analytics, but their teams have difficulty defining the path to value creation and the implications for technology strategy, operating model and organization. Too often, companies delegate the task of capturing value from better analytics to the IT department, as a technology project,” the note pointed out.
In the 2006 movie Deja Vu, law enforcement agents investigate an explosion on a ferry that kills over 500 people, including a large group of party-going sailors. They use a new program that uses satellite technology to look back in time for four-and-a-half days—to try to capture the terrorist.
Predictive policing is surely not as advanced today. And advances in predictive analytics can certainly raise ethical issues. For instance, the police may in the future be able to predict who might become a serial offender and make an intervention at an early stage to change the path followed by that person, as is the case in Deja Vu. Or an insurance firm may use predictions to increase a premium or even deny a user insurance altogether.
Any disruptive technology needs checks and balances in the form of good policy if it is to deliver to its potential.

​Source: http://www.livemint.com/Industry/KUE7JGODJlGgYdmQ2VX5KM/Making-predictions-with-Big-Data.html


Why big data requires marketers to think small?

4/20/2017


 
Even in terms of definition, there’s some disagreement regarding the power and potential value of big data for marketers.
Google big data and you’ll find it defined as:
 “… extremely large data sets that may be analysed computationally to reveal patterns, trends and associations, especially relating to human behaviour and interactions”
Forbes will tell you it’s:
 “…technology (which includes tools and processes) that an organisation requires to handle large amounts of data and storage facilities.”
On the other hand, Webopedia will tell you big data is:
 “… a term for data sets that are so large or complex that traditional data processing softwares are inadequate to deal with them, leading to challenges of data capture, storage, analysis, curation, search, sharing, transfer, visualization, querying, updating and privacy.”
So, big data is essentially a lot of data that marketers can analyse using innovative techniques to give marketing insight and power.
Or perhaps it isn’t.  Perhaps there’s actually so much data that it cannot be used in a constructive way.
Information overload?
Even if we do accept that it can be utilised within marketing, some argue big data is no more than a sophisticated and complicated way of recreating the past.
Phenomena such as ‘mirroring’ or ‘echoing’, where consumers’ previous behaviour shapes how we market to them in the future, risk stifling innovation and preventing disruptive marketing.
Indeed, we are increasingly beginning to hear of major (and expensive) data-led marketing programmes whose sophisticated in-built metrics reveal - in uncomfortably precise detail - a remarkable lack of effect. There seems to be a growing body of opinion that maybe Big data isn’t quite the claimed panacea bestowing huge insight and power to the marketing professional.
Within the marketing community the debate seems to be moving back to whether big data techniques represent a brave new world or a massive con trick.
The benefits of thinking small
I’d like to suggest a couple of more useful ways of considering big data. Both are pretty obvious, and both have been previously discussed.
But as debate has moved on, they seem in danger of being forgotten.
Think in terms of individuals
You can measure the aggregate behaviour of an unimaginably large crowd, but you must understand the individual movements of each person in the crowd for the data to be of any use.
Measuring collective behaviour makes for accurate history but you need to understand individual motives to shape your future.
Think in terms of clues
The second point is slightly counter-intuitive, and regards how we see the data itself.
It may be intelligent for marketing not to consider big data as ‘facts’.
Instead, we should think of it as clues. Clues about patterns in what people are doing, what they’re thinking and what they’re feeling.  And then more clues about influences: what media people are accessing, how they’re receiving and interpreting messages and how they’re changing their minds.
But the thing about clues is that they don’t solve themselves. Clues need to be read, interpreted and understood, and that takes imagination.
And there lies the heart of value of big data and its usefulness as a marketing tool.
Interpretation is everything
It’s unarguably true that big data sources and technologies are hugely powerful in examining markets, in planning how to address audiences with more effective and efficient targeting, and within evaluation and effectiveness assessment regimes. However, when conducted mechanically, with data leading marketing, increasingly sophisticated marketing tools lead to increasingly crude and clumsy marketing programmes.
Big data sources and big data techniques will not remove the need for imagination and creativity in marketing. They will raise the bar.
And so, the real challenge to the marketing community may be how to recast the ‘traditional’ marketing skills of imagination and creativity to effectively manage the potential of big data sources and technologies.
A few different models seem to be evolving:
Process solutions
Formalising a series of ‘what if?’ visualisation pauses through the course of marketing development programmes appears to offer potential in terms of ensuring data is continually used with, rather than replacing, insight. 
Team solutions 
A number of new tech organisations have adopted the route of creating a discrete function of data commentators, analogous to rally car navigators, within marketing teams, whose role throughout the process is to monitor the implications of the data in the context of a clearly defined longer-term strategy and an unchanging end objective.
Management solutions
Several large-scale traditional organisations have, over the past 18 months, responded to opportunities and challenges in this area by establishing a separate Executive Board role of Chief Intelligence Officer.
Any approach adopted will of course reflect the organisation and task in question.  However, to succeed, one principle may need to underpin any approach:
Key points
Big data can only record the past: draw inferences, don’t take dictation
Big data measures the aggregate — we must understand the individual:  Think small
Big data won’t provide a ready-made answer or do your thinking for you:  Look for clues
Above all, big data is complex and nuanced.  Effective management and direction demands ultimate simplicity and clarity in strategy: keep it simple.

​Source: https://www.marketingtechnews.net/news/2017/mar/23/why-big-data-requires-marketers-think-small/


5 Ways Big Data Analytics Can Help Your eCommerce Business

4/19/2017


 
​The words ‘Big data’ are thrown around a lot these days, but there is no definition that is universally accepted. The best definition of Big data comes from analyst Doug Laney, who said in 2001 that Big data is defined by ‘The 3Vs’ – including velocity, variety and volume. This means that Big data is a large amount of content that is varied and being produced quickly. Here are five ways that Big data analytics can help your online company.

1. Examine Google Trends
Big data analytics can help your business by giving you an opportunity to examine trends on Google. Trend data shows you what kind of terms and keywords have been searched, where they were searched and who they were searched by. “This information helps you see what the public is interested in, and allows you to adapt to that specific market. Trends can also help you decide the best direction for your website”, – says Jane Reed, Operation Manager at Paper Fellows.
2. Prevent Fraud
Through the analysis of large data sets, you can identify where different kinds of fraud are most prevalent. For instance, you can find out in which states or countries credit card fraud is most common, or where cash-on-delivery commitments are not honored. This allows you to take steps to ensure that you are not a victim of fraud, by implementing anti-fraud measures in specific areas or by avoiding doing business in those areas entirely.
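As a rough illustration, a few lines of analysis over past orders can already surface where fraud is concentrated; the order records below are invented.

```python
import pandas as pd

# Minimal sketch of spotting where fraud is most prevalent: group past
# orders by region and compare chargeback rates. The rows are invented.
orders = pd.DataFrame({
    "region":     ["NY", "NY", "CA", "CA", "TX", "TX", "TX"],
    "chargeback": [0,     1,    0,    0,    1,    1,    0],
})

fraud_rate = orders.groupby("region")["chargeback"].mean().sort_values(ascending=False)
print(fraud_rate)  # regions at the top may warrant extra anti-fraud checks
```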
3. Introduce New Products
If your company is in the business of developing and creating products based on new trends, Big data analytics allows you to examine new and upcoming trends. When you use data collected from search engines, social media, surveys, forums and other online networks, you can learn more about what your customers might want.
Carol Wise, Big Data Specialist at Boomessays, comments: “By filtering data by characteristics of customer, you can find out what your target market is interested in, and discover what kinds of new products they may enjoy”.
4. Improve Your Customer Service
As well as information about what kinds of products your customers or target audience may be interested in purchasing, you can use Big data to find out what kind of services your target audience may be interested in.
“Using data on each customer, as opposed to noticing general patterns, you can learn extremely valuable things about your customer service. Reading the feedback from customers achieved through surveys, you can learn about problems within your company that are causing a significant problem”, shares Carol Wise, Data Analyst at Essayroo.
If only three of your customers experience the same problem, it does not form a pattern, but it does mean that you could lose those three customers. For this reason, big data is valuable both at an individual customer level and at a general level.
5. Make Big Predictions
Big data analysis makes it possible for you to obtain an in-depth look at the many different channels in your company, from inventory to sales. You can, for instance, use big data analysis to look at the time of year in which certain products of yours sell best. You can then take this information and make sure that you have enough stock available to make the most of this surge in interest.
By using big data to make predictions, you can ensure that your business is always ready for seasonal surges, and make the most out of the products you sell.
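As a rough illustration of that kind of prediction, the sketch below averages invented monthly sales and flags the months that historically run well above the yearly mean.

```python
import pandas as pd

# Minimal sketch of anticipating seasonal surges: average each month's
# historical unit sales and flag months well above the yearly mean.
# The two invented years of monthly sales below are illustrative.
sales = pd.DataFrame({
    "month": list(range(1, 13)) * 2,
    "units": [310, 295, 330, 360, 400, 520, 610, 590, 450, 380, 700, 940,
              335, 310, 345, 370, 415, 540, 640, 610, 470, 400, 760, 1010],
})

monthly_avg = sales.groupby("month")["units"].mean()
surge_months = monthly_avg[monthly_avg > 1.25 * monthly_avg.mean()]
print(surge_months)  # months to stock up for ahead of time
```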
Spend Time with Big Data
Big data might seem confusing and complex at first, but once you spend some time learning how to analyze trends, the benefits become quite obvious. Spend some time exploring big data and your business will soon see the benefits.

