Rutgers Big Data Certificate Program

The Relationship Between Big Data and IoT

3/31/2017

7 Comments

 
Why Big Data and Big Data analytics are crucial to how the Internet of Things works and grows.
“IoT is the senses, Big Data is the fuel, and Artificial Intelligence is the brain to realize the future of a smart connected world.”
IoT is about devices, data and connectivity. The real value of the Internet of Things lies in creating smarter products, delivering intelligent insights and producing new business outcomes. As millions of devices get connected, the Internet of Things will trigger a massive inflow of Big Data. The key challenge is visualising and uncovering insights from various types of data (structured, unstructured, images, contextual, dark data, real-time) in the context of your applications. I believe deriving intelligence from Big Data using Artificial Intelligence technologies is the key enabler for smarter devices and a connected world.
The end goal is to harness the data coming from sensors and other contextual information to discover patterns and correlations in real-time to positively impact businesses. Existing Big Data technologies need to be augmented to effectively store, manage and extract value from continuous streams of sensor data. For instance, it is estimated that connected cars will send 25 gigabytes of data to the cloud every hour. The biggest challenge will be to make sense of this data, identifying data that can be consumed and quickly acted upon to derive actionable events.
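As a rough illustration of what "identifying data that can be quickly acted upon" might look like in practice, here is a minimal Python sketch that filters a continuous stream of sensor readings down to actionable alerts. The field names and thresholds are invented for illustration, not drawn from any real connected-car platform.

```python
# A minimal sketch of turning a raw sensor stream into actionable events.
# The metric names and thresholds below are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class SensorReading:
    vehicle_id: str
    metric: str      # e.g. "engine_temp_c", "tire_pressure_psi"
    value: float

# Hypothetical alert rules: metric -> (min, max) acceptable range.
ALERT_RULES = {
    "engine_temp_c": (0.0, 110.0),
    "tire_pressure_psi": (28.0, 36.0),
}

def actionable_events(stream):
    """Yield only the readings worth acting on, discarding the rest."""
    for reading in stream:
        low, high = ALERT_RULES.get(reading.metric,
                                    (float("-inf"), float("inf")))
        if not low <= reading.value <= high:
            yield reading  # out of range -> actionable

if __name__ == "__main__":
    demo = [
        SensorReading("car-42", "engine_temp_c", 96.0),
        SensorReading("car-42", "tire_pressure_psi", 22.5),  # too low
    ]
    for event in actionable_events(demo):
        print(f"ALERT {event.vehicle_id}: {event.metric}={event.value}")
```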
The use of Artificial Intelligence technologies like deep learning will be a key differentiator in deriving insights rapidly from massive streams of data. With IoT, Big Data analytics will also move to the edge for real-time decision making: for instance, detecting crop patterns in agricultural fields using drones in remote places, detecting suspicious activities at ATMs or predicting driver behavior for a connected car.
To sum it up, IoT is the senses, Big Data is the fuel, and Artificial Intelligence is the brain to realize the future of a smart connected world.

Source: https://dzone.com/articles/the-relationship-between-big-data-and-iot
7 Comments

How big data analytics in sport supports the experts

3/30/2017

5 Comments

 
For sports performance specialists, there has been a convergence of technology that has allowed the collection and analysis of sporting metrics like never before.
Last year’s Rio Olympic and Paralympic games saw new levels of performance from the world’s elite athletes, and new and emerging technology played a pivotal role in this success. 
While the latest hydrodynamic bodysuit for swimmers and exactly what Usain Bolt eats for breakfast might be headline grabbing, the gear that is really making the difference for athletes is the wearable tech transmitting real-time information about performance and the software capturing and analysing this. 
“I’ve always believed analytics was crap.”
So said Charles Barkley, one of the greatest basketball players ever to feature in the NBA. 
But for all his achievements, Sir Charles’ views are not shared by the top athletes of today. 
“One of the most significant developments in technology has been data. The speed at which data can be converted into athletic training is incredible, and what used to require several hours in a lab can be taken in the velodrome as you train. For me, harnessing instant and accurate data couldn’t have gone better. It helped me get in tune with my body and performance to get everything right on race day, and it helped me walk out of Rio2016 with three Paralympic medals.”
That’s the view of Steve Bate, one of the many British Paralympic and Olympic heroes who can testify to how wearable tech and big data analytics are giving them a competitive edge in their sport. 
In January 2017, Steve Bate and other leading figures in sports and sports analytics came together for InTheCity at a high-profile event at Manchester Town Hall to look at technology and analytics in sport. What was the impact? What difference was it making? And was it taking the “soul” out of sport?
History of analytics in sport
Few sports are quite as rich in data as Major League Baseball. For the uninitiated, a BA of .321, an SLG of .612 and an RBI of 123 over a season might seem like random numbers. But for the baseball insider, these figures, and others, tell them everything they need to know about a player’s form and potential. 
Each sport has its own set of statistical markers which coaches pore over to find out where athletes are excelling and where they’re flagging. 
Until recently though, these statistics had to be both collected and analysed manually. Athletes would have to train in laboratory conditions, covered in wires. Despite the best technology of the time, this could not accurately replicate competitive field conditions. 
However, the simultaneous development of both wearable tech and big data analytics has driven analytics capture and analysis into the digital age. 
Wearable technology allows real-time collection and transmission of a wealth of performance critical metrics such as heart rate, blood pressure, respiratory performance, muscle performance and speed. 
These metrics are only as useful, though, as the ability to make sense of them. This has been the toughest nut to crack, and it’s where big data analytics comes in. 
Through analytics, the raw data that comes from the individual athlete or team can be examined at a granular level to better understand where improvements can be made to that athlete’s technique or training schedule. 
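To make "granular" concrete, here is a toy Python sketch that summarises a wearable heart-rate stream per training interval, the kind of aggregation an analyst might start from. The window size and the sample figures are illustrative assumptions.

```python
# A toy sketch of granular analysis on wearable data: summarising a
# heart-rate stream per fixed training interval.

from statistics import mean

def interval_summaries(heart_rates, window=5):
    """Split a heart-rate series into fixed windows and summarise each."""
    for start in range(0, len(heart_rates), window):
        chunk = heart_rates[start:start + window]
        yield {
            "interval": start // window,
            "avg_bpm": round(mean(chunk), 1),
            "peak_bpm": max(chunk),
        }

# Invented session data for illustration.
session = [112, 118, 125, 131, 140, 155, 161, 158, 149, 137]
for summary in interval_summaries(session):
    print(summary)
```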
It’s helping prevent injuries too. Rugby union teams now use data monitoring and analytics to spot the warning signs of injury before it strikes. It’s making dangerous sports less dangerous without impacting on the spectacle.
In fact, real-time analytics have the power to enhance spectator enjoyment. Looking back at our baseball example, sports fans are unashamed stats nerds and the more they can get access to, the happier they will be. And imagine being able to see the heart-rate of Cristiano Ronaldo as he steps up to take a free kick or Joe Root as he faces a 90 MPH delivery from Morne Morkel. 
It would give the spectator a unique insight into the experience of the athlete like never before. 
What sports insiders think
At the January event in Manchester Town Hall, Steve Bate and Steve Flynn, Director of GB Taekwondo, spoke about the impact wearable tech and analytics have made on their sports. Bate explained how wearable tech had made a huge difference to his training:
“We can do real time out on the road, we don’t need to be in a lab any more with masks on and things like that because actually with having power and heart rates and stuff we can do a lot of that.”
Of course, the data still needs analysing and Bate made it clear how this had given him a competitive edge:
 “The helmet, the skin suit, the socks you wear, really small, minor gains, marginal gains, but actually we’ve learnt that, certainly in the last six months, that the biggest gain you get is your physical performance. And the fitter you are, the faster you’re going to go.”
Steve Flynn dropped an anecdote from the Beijing Olympics which demonstrated the impact of tech on his sport. 
“At the Beijing Olympics, technology identified a winning move where referees did not, ultimately leading to a rule change. This was a pivotal moment in Taekwondo, and we now use electronic sensors to track hits and remove the potential for controversial referee decisions.”
For Flynn, though, this is just the beginning and he looked to how wearable tech and analytics could give coaches on the spot info about how their athletes were coping under the highest pressure that could never be recreated in training. 
“Wearables, smart technology like that, would make life a lot easier for us… In the Olympic final and the chips are down – how will you react to it? I’d love to know what Lutalo’s (Muhammad, Team GB Taekwondo silver medallist at Rio 2016) heart was doing when he had a second to go in his Olympic final and he thought he’d got the kick and he hadn’t, and he lost his go. Was he full of adrenaline? Or had he relaxed to the point where he thought, ‘I’m done’? And that’s where you really want to know what’s going on.”


Conclusion
For sports performance specialists, there has been a convergence of technology that has allowed the collection and analysis of sporting metrics like never before. Wearable technology allows the collection and transmission of data in real-life circumstances. Out on the road or in the velodrome on a bike, and in the taekwondo dojang, big data analytical power allows coaches and sports scientists to truly understand and improve performance. 
As wearable technology continues to develop, allowing for the collection of even more metrics, the information given to analysts will be more accurate and more reflective of what is happening to athletes where it counts: on the field of play. 

Source: http://www.itproportal.com/features/how-big-data-analytics-in-sport-supports-the-experts/

5 Comments

New enterprise platform streamlines getting business intelligence from big data

3/29/2017

2 Comments

 
Although big data is currently all the rage, extracting meaningful business intelligence from it can prove costly and time consuming.
Data acceleration company Jethro is launching its latest platform offering an all-in-one enterprise solution that combines the power of indexing architecture with 'auto-cubes' to accelerate extracting business intelligence from big data.
Jethro 3.0 eliminates costly and labor-intensive data engineering tasks such as pre-aggregating tables, manually building cubes, or keeping up with new and changing applications. Instead, Jethro automatically creates cubes based on real-world user queries, fully-indexes all table columns, and manages an intelligent query result cache.
Unlike SQL-on-Hadoop engines that full-scan billions of rows of data for every query, Jethro makes use of its indexes, cubes and cache to process queries with much less effort and greater speed. It delivers consistently fast performance for any BI query, on any size dataset, and with any number of concurrent users. Users can interact with their BI dashboards and generate actionable business insights much more quickly than with other systems.
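The contrast between full-scanning and index-based querying is easy to demonstrate. The following conceptual Python sketch answers the same question both ways: by touching every row, and by consulting a prebuilt column index that turns the query into a single lookup. This illustrates the general technique only, not Jethro's actual architecture.

```python
# A minimal illustration of why an index beats a full scan: answering
# "which rows have city == 'Austin'?" two ways. Conceptual sketch only.

from collections import defaultdict

rows = [
    {"id": 1, "city": "Austin", "sales": 120},
    {"id": 2, "city": "Boston", "sales": 340},
    {"id": 3, "city": "Austin", "sales": 275},
]

# Full scan: touch every row for every query.
def full_scan(rows, city):
    return [r["id"] for r in rows if r["city"] == city]

# Index: build once, then each query is a single dictionary lookup.
city_index = defaultdict(list)
for r in rows:
    city_index[r["city"]].append(r["id"])

# Both approaches return the same answer; the index just does far less work.
assert full_scan(rows, "Austin") == city_index["Austin"] == [1, 3]
```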
Jethro 3.0 also comes with improved enterprise security features, including Lightweight Directory Access Protocol (LDAP) authentication and role-based permissions, allowing customers to set clearly-defined security responsibilities within their own company. In addition, it offers the ability to directly load data from Hadoop tables and an improved management GUI.
"Today's approach to BI on Big Data is not working. Under the SQL-on-Hadoop hype lies monumental failure rates with existing approaches," says Eli Singer, CEO of Jethro. "With a purposely built tool like Jethro, which leverages a combined automation and acceleration architecture, 3.0 provides high-performing enterprise BI at lower Big Data costs. Nobody else makes existing BI applications work on big data like Jethro."
You can read more and view a live demo of how it works on the Jethro website.

Source: https://betanews.com/2017/03/29/business-intelligence-big-data/

2 Comments

The Age of Everything: The power of big data and IoT

3/28/2017

1 Comment

 
The fundamental goal of enhancing technology is to make living more convenient, safer and efficient.
In the world of big budget science fiction movies, an integrated society where everything is connected, talking to each other, responding, learning and forming decisions is something that you are familiar with. However, many of these fantasy creations are not far from becoming reality. In fact, they are much closer than we believe.
The internet has revolutionised many aspects of our lives; it has not only made several things accessible, but also changed the way we do many things. IoT, or the Internet of Things, is about more than just connectivity. It is a connection that is mobile, virtual and instantaneous, and it is going to make the things in our lives ‘smart’. IoT, a term coined by Kevin Ashton, revolves around increased machine-to-machine communication built on cloud computing and networks of sensors that gather data. According to Forbes research, of all the technology trends taking place right now, IoT is believed to be the biggest and most disruptive one. It is also believed that IoT will create the highest number of business opportunities in the coming years.
What is IoT and why it is so important
The fundamental goal of enhancing technology is to make living more convenient, safer and more efficient. IoT refers to scenarios where network connectivity and computing capability extend to devices that we use in our daily lives, allowing them to generate, exchange and consume data with minimal human intervention. The captivating part for consumers and enterprises is that anything that can connect will be connected. Though connecting and controlling devices has existed for decades, a few trends, such as ubiquitous connectivity, adoption of IP-based networking, computing economics, miniaturisation, advances in data analytics and the rise of cloud computing, have brought IoT much closer to reality. 
Value can be extracted from big data only when it is processed and analysed in a timely manner, with the results made available in a way that can influence business decisions. IoT is driving new interest in big data by generating enormous amounts of new types of data to be stored, processed and accessed. The value IoT creates lies at the intersection of gathering data and leveraging it. Cloud-based applications are the key to using this data, as these applications interpret and transmit data coming in from various sources. 
With big data analytics taking shape as a crucial tool in the galaxy of networking and computing, enterprises around the world have understood its significance. The reason for this is that big data is a useless chunk of digital signals if you are unable to process it and deliver sharper insights to users. This is where data analytics comes into the picture; the nuts and bolts of the operation are already in place, with business opportunities worth trillions of dollars being forecast.
How many IoT devices, and how much data can be generated? ‘Massive’ is the only way to put it. By the year 2020, EMC and IDC forecast that IoT devices will reach 32 billion, while Gartner goes with the more conservative estimate of 26 billion. The associated surge in big data stands at 44 zettabytes - that’s 44 trillion gigabytes for the uninitiated. IoT and big data have the potential to shape and change the way we do business and the way we live our lives in the future.
Predictions and challenges of IoT and big data
Over the next five years, enterprises are set to focus on the adoption of IoT-enabled devices, technology and infrastructure. In line with this, organisations have already channelled their investments and put in place advanced computing departments to make their businesses ‘smart’. These departments have started developing systems and designing processes, which has opened new opportunities for talent to develop and jobs to be created. Consumer convenience is the primary focus for most consumer businesses: everything from smart home locks to smart electronic appliances is finding a place in the market thanks to ease of use. 
Despite organisations pushing hard for the increased adoption of IoT devices, cybersecurity is the top impediment to IoT deployments. There are widespread concerns about the security of products and their ability to protect the data they are collecting. For example, in the healthcare industry, wearable tech is used to track patient health remotely, but a breach in this data transfer can attract penalties from the FTC. These risks are common with innovations of this kind, especially on such a colossal scale. Hackers not only use IoT products as an access point to enter the network but can also cause physical damage to them. Currently there is an inherent lack of security within IoT devices because vendors continue to release products with little or no defence against cyberattacks. 
The road ahead
The IoT and big data are two of the most talked about technology topics, and they are intimately related. The two share a closely knitted future, and together they will create new opportunities and solutions that have a long-lasting impact at the enterprise level.
The Internet of Things will become a much more diverse network in the future, where its end points will not be constrained to certain areas but will span all arenas, and it will emerge as the most important big data analytics cloud. For organisations looking to make it big in this field, it’s crucial to standardise the architecture, provide better safety features and home in on the benefits of big data analytics.

Source: http://www.itproportal.com/features/the-age-of-everything-the-power-of-big-data-and-iot/

1 Comment

They’re no Palantir: smaller startups with new ideas take on big data

3/27/2017

0 Comments

 
Delivering an eBay order in under 30 minutes, mining 11 million financial documents for evidence of illegal activity, helping humans reach Mars more quickly — three seemingly unrelated feats made possible using a new data analysis tool that is sweeping Silicon Valley.
Forget spreadsheets. It’s all about graph databases, which map information using an intricate web of connections between data points. Looking at data that way, an idea popularized by industry leaders like Facebook and Google, lets users spot relationships that otherwise might be missed. Experts say it is helping revolutionize the field of data as the world is flooded with more information than ever before.
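A toy example shows the idea. In the Python sketch below, a handful of entities and relationships are stored as an adjacency list (a bare-bones stand-in for a graph database), and a breadth-first search surfaces a chain of connections between two entities that a flat spreadsheet would not reveal directly. The entities and edges are invented for illustration.

```python
# A toy "graph database": entities connected by directed relationships,
# searched for an indirect connection between two of them.

from collections import deque

# Hypothetical edges, loosely in the spirit of offshore-ownership chains.
edges = {
    "Person A": ["Shell Co 1"],
    "Shell Co 1": ["Law Firm X"],
    "Law Firm X": ["Shell Co 2"],
    "Shell Co 2": ["Person B"],
}

def find_path(start, goal):
    """Breadth-first search for a chain of relationships."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in edges.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(find_path("Person A", "Person B"))
# ['Person A', 'Shell Co 1', 'Law Firm X', 'Shell Co 2', 'Person B']
```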
“Data does feel like the new oil. It’s kind of the commodity that makes everything go,” said Zavain Dar, a principal at Menlo Park-based venture capital firm Lux Capital. “And it’s really on the enterprise now to have as fine-tuned of an engine as possible.”
Collecting the data is no longer the hard part — it’s sucked up when you search and post online, and gathered by everything from autonomous cars, to satellites, to smartphones, and funneled to companies or government organizations. Tech companies’ challenge now is figuring out how best to analyze that data.
Silicon Valley startups are attacking that problem, a trend that experts say could ultimately be problematic for incumbents of the big-data industry — such as Palantir — which risk losing some of their market share to the newcomers.
Neo Technology, a San Mateo-based startup named after the main character in The Matrix movie trilogy, is one of the early pioneers of graph database technology. Previously, that type of data analysis was reserved for companies with big wallets and deep talent pools, like Facebook and Google. Neo Founder and CEO Emil Eifrem says his team offers its own version of that “little piece of Silicon Valley magic” to the masses.
For example, last year the International Consortium of Investigative Journalists used Neo’s technology to dig through the more than 11 million records revealed in the Panama Papers leak, searching for evidence of corrupt offshore tax activity.
NASA uses Neo technology to manage the space agency’s database of more than 10 million documents detailing lessons learned from past mission failures and successes. Using Neo, searches in the database take less time — days instead of weeks or months — and return more relevant results, said David Meza, chief knowledge architect at NASA. He said Neo helped one NASA engineer save millions of dollars and up to two years by locating existing research he could use in his work on the Orion, the spacecraft NASA hopes eventually will take humans to Mars.
Closer to home, eBay engineer Volker Pacher demonstrated another use for Neo’s technology. Standing onstage at Neo’s 2014 graph database conference, Pacher ordered a bottle of bourbon on eBay. As he finished the talk half an hour later, a courier approached the stage with his delivery, eliciting cheers from the crowd. Neo had facilitated the transaction, working behind the scenes to find the best delivery option.
eBay has since pulled the plug on same-day delivery in the U.S., but continues testing the service, using Neo, in the U.K.
Eifrem said he has seen interest in his industry explode since Neo launched its first graph database product in 2011, when a meet-up for graph database fans drew four or five people. Now the events bring in crowds — more than 1,000 attended the company’s recent conference in San Francisco.
Eifrem called that buzz “amazing,” but admitted it has brought more competition.
“The flip side is that in the past 12 to 18 months, IBM has announced a graph database,” he said. “Microsoft has announced that they’re working on several internally; Oracle has launched a graph database.”
There also are smaller companies to contend with. San Jose-based Objectivity, which has been in the data analysis business for two decades, started developing its own graph database two years ago to address the growing demand for the technology. Menlo Park-based Ayasdi sells data analysis software that uses graph databases and other tools to help hospitals pinpoint the best patient care, or banks uncover money laundering.
Many data companies are eyeing the giant in the room — Palantir. The secretive, $20 billion company in Palo Alto has long been known for helping government and corporate clients solve tough data problems — its technology is rumored to have helped U.S. forces track down Osama bin Laden.
San Francisco-based data analysis company Gemini advertises itself as “the poor man’s Palantir” because it offers similar services that it says are cheaper and simpler to deploy. Unlike Palantir, which custom-builds platforms for each client and sends its own engineers to help clients get set up, Gemini builds a service that any company can put on top of its existing platform.
Palantir declined to comment.
That type of competition puts pressure on companies like Palantir, Dar said.
“Once you’re so big it’s harder to pivot and incorporate emerging technology paradigms as they mature,” he said. “Can Palantir successfully incorporate these novel toolkits in a way that meets their customer expectations before getting displaced by faster, more agile upstarts?”

Source: http://www.mercurynews.com/2017/03/27/theyre-no-palantir-smaller-startups-with-new-ideas-take-big-data/
0 Comments

How to start using big data SLAs

3/26/2017

0 Comments

 
Big data leaders, when you're asking IT pros for service level agreements, there's no need to reinvent the wheel--start the process with three basic steps.
Many companies have moved past the experimental stage of big data and turned their attention to implementing big data and analytics processing in production—they're even making some of these applications mission critical. Moving these applications to a mission-critical status requires them to be timely and readily accessible to the decision makers who need them.
Given these circumstances, the time has come for big data service level agreements (SLAs).
The purpose of an SLA is to guarantee business users get certain minimum performance and service levels on their IT investment. SLAs are most commonly used for transactional systems, such as the ability to process x millions of hotel reservation transactions an hour or a commitment to 24/7 computer uptime for an airline reservation system.
Because big data and analytics have been largely experimental for organizations, users have yet to demand SLAs for big data from IT, and IT has not volunteered to offer them, either. It's time for this to change.
First, let's look at the big data user side.
If users are utilizing analytics reports and they expect IT to deliver these reports in a timely way to achieve business impact, requirements have to be defined for report delivery. In some cases, such as the Internet of Things (IoT), users will want real-time status reporting with to-the-minute alerts that are actionable. In other cases, it might be sufficient to get analytics reporting on a daily, weekly, monthly, quarterly, or yearly cycle.
A second area of user concern is the time to market for new big data applications that they want for the business. Users want these applications as quickly as possible so they can start getting business value from them.

Now, let's look at the services that relate to IT operational performance and that must be delivered in order to meet business users' needs.
As new big data applications are developed, the underlying technical goals have to be 1) speeding up the time it takes to develop, debug, and place new applications into production; and 2) speeding up system efficiencies and processing so that more developers can use development resources concurrently.
On the systems side, this could translate into SLAs for system performance, the ability to handle a specific number of concurrent application development users, or tools that reduce the time it takes to debug applications because of the automation they offer. 
On the network side, there might be some quality of service (QoS) minimum service levels that must be met in order to facilitate a given level of concurrent big data development and testing activity.
Finally, there are the big data deliverables. For the analytics reports that must get into the hands of users instantaneously, system uptime must be guaranteed. For the analytics reports that are delivered daily, weekly, monthly, quarterly, and annually, batch processing jobs must be written, implemented, and monitored to ensure that all deliverable targets are met.
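As a starting point, a report-delivery SLA can be encoded as data and checked mechanically. The Python sketch below is one minimal way to do that; the report names and target lags are illustrative assumptions, not recommendations.

```python
# A minimal sketch of encoding report-delivery SLAs and checking
# measured delivery lags against them. All names and targets are
# illustrative assumptions.

from datetime import timedelta

# SLA targets: report -> maximum acceptable delivery lag.
SLA_TARGETS = {
    "iot_realtime_alerts": timedelta(minutes=1),
    "daily_sales_report": timedelta(hours=24),
    "monthly_summary": timedelta(days=31),
}

def sla_breaches(actual_lags):
    """Return the reports whose measured lag exceeded their SLA target."""
    return {
        report: lag
        for report, lag in actual_lags.items()
        if lag > SLA_TARGETS.get(report, timedelta.max)
    }

measured = {
    "iot_realtime_alerts": timedelta(seconds=45),  # within target
    "daily_sales_report": timedelta(hours=30),     # breach
}
print(sla_breaches(measured))  # only the daily report is flagged
```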
How to start using big data SLAs
Few companies have well-orchestrated application and report delivery SLAs on the analytics side that can match what they have in transactional IT, so what is the best way to get started? This three-step process should help.
First, borrow a page from the guarantees that you provide business users for transactional reporting. You should sit down with users and determine which analytics they need in real time (with system uptime performance guarantees), and which analytics reports they need on the batch side (i.e., daily, weekly, monthly, quarterly, annual reports) so you can schedule the production of these reports and plan big data computing resources to produce them in the required timeframes.
Second, review your big data processing and application development approach. Most big data and analytics systems are highly distributed; they don't have all of the storage and processing "under one hood," like a mainframe or a single monolithic server that processes incoming hotel or airline reservations. Instead, big data and analytics systems feature multiple servers that all run in parallel. It is more challenging to manage workload execution in an IT environment that is spread across multiple servers instead of just one server.
Third, don't forget about the network. It doesn't matter how well you tune your systems or organize your workloads if your internal network can't provide the bandwidth needed to support big data processing and data transport.
The good news
You don't need to reinvent the wheel when you define metrics, service, and deliverable targets for big data analytics because you can borrow the mechanics from the transactional side of your IT work. For many companies, it's simply a matter of looking at service level processes that are already in place for their transactional applications, and then applying these processes to big data and making necessary adjustments to address unique elements of the big data environment, such as parallel processing and the handling of many more types and forms of data.

Source: http://www.techrepublic.com/article/how-to-start-using-big-data-slas/
0 Comments

5 Misconceptions Small Business Owners Have About Big Data

3/25/2017

2 Comments

 
From increasing customer loyalty to improving inventory management to strengthening B2B relationships, let data help your business.
Data is not a new conversation, yet small businesses still have many misconceptions about how data may be valuable to their business. In the past, only large brands could afford to implement data into their business efforts. Today, however, data is accessible to businesses of all sizes -- including yours. Overlooking this new reality occurs far too often and for all the wrong reasons. With this in mind, it is imperative for small businesses to finally overcome the various misconceptions about data and their businesses.  
Misconception No. 1: Human touch outweighs anything automated.
Entrepreneurs are a unique breed that deliver passion, excitement and cognitive abilities unlike many of their corporate peers when it comes to nurturing a business. The human touch that they offer is undoubtedly a significant aspect of what makes many entrepreneurs successful -- yet this same human touch can potentially inhibit success if it is employed at the expense of data collection.
“One of the most common misconceptions is that people believe they will always outperform computers in their decision-making process. That may have been the case in the past, but with the complexity of today’s markets and the advancement of technology, this assumption no longer holds true,” says Victor Rosenman, CEO of Feedvisor.
Expanding on this, Rosenman shares what many of us already know but too often need to be reminded.
“All business owners are constantly required to make critical decisions, and the most effective decisions are not based on gut feelings, but on facts and data.”
With a reported 28 million small businesses in America, there is too much competition to dismiss the value that data can bring to your business. Combined with human touch, data is a powerful asset that small businesses should leverage instead of dismiss.
Misconception No. 2: Revenue will not be enhanced due to data.
Small businesses come in all forms, but the common denominator is that they need to make money. Using data -- including artificial intelligence -- small business owners can save time and money when applying data solutions to their businesses. 
“Artificial intelligence gives small business owners and entrepreneurs the ability to run a lean operation. There are solutions in AI that range from automated call software to market intelligence to retail inventory management and even to sales, such as Salesforce's Einstein software recently launched using IBM’s Watson technology. The key here is automation, as AI can speed up or eliminate manual processes altogether. Ultimately, these tools offer insights into operations and keep costs low while enabling businesses to function at a much higher level, and see higher revenue as a result,” explains Igor Gorin, CEO of Astound Commerce.
Using data, companies can analyze what has taken place within their businesses, leveraging the findings to make future decisions. With this in mind, relying solely on human instincts is a risky proposition.
Misconception No. 3: Data should immediately solve problems.
Instant gratification is nice, but instant gratification isn’t always the solution.
“The view of cognitive systems as brains that automatically solve any problem is a popular misconception. These tools are ideally suited to do things like scale human expertise and augment human intelligence,” IBM’s Brandon Buckner recently explained.
Keeping what Buckner says in mind, consider how technology can support your business rather than take the lead. Watson Explorer -- a platform that gives businesses access to various data touchpoints to help drive business performance and growth -- is an example of how businesses can benefit in their decision-making and ROI by using technology. Using data, small business owners can then make stronger decisions on future business strategies.
Misconception No. 4: Data is too broad for niche businesses.
Strategizing ideas will always be on an entrepreneur’s to-do list, and it’s these moments that often shape the future of businesses. Yet too often, small businesses neglect to use data to also help shape their future, a mistake that can be both costly and potentially fatal.
“Entrepreneurs are people of conviction with big ideas. Unless their service or product revolves around data, data analysis probably isn’t their expertise. Without a specialist within the business, data gathering and analysis is often pushed to the side for priorities that are more directly connected to bringing a product or service to market . . . which is a shame because data can help an entrepreneur make the necessary pivots to improve a business very early on. One of the things we’ve seen from the Fiverr marketplace is a sharp increase over the last year in data analysis services, growing 243 percent year over year,” shares Brent Messenger, global head of community at Fiverr.com, an online marketplace offering tasks and services beginning at just $5 per job performed.
With data analysis services such as those available at Fiverr.com being both affordable and easy to access, the costs involved with data can no longer be an excuse for niche market businesses. In fact, data can sometimes even be free for small businesses, which leads us to our final misconception.
Misconception No. 5: Collecting data will cost too much money.
Not much is free nowadays, but fortunately you can collect data for free with Google’s web-traffic monitoring tool Google Analytics, which provides all types of data about website visitors and traffic sources. Using this, businesses can extract data to reveal valuable information including how audiences engage on their websites, origins of website traffic and more.
Expanding on the value of data for small businesses is Dan Wilkinson, chief commercial officer of 1WorldSync, who explains that “small businesses are often under the assumption that they should be managing their company and customer data themselves through manual processes to save on the costs of a data provider. Many entrepreneurs take this task upon themselves, spending massive amounts of time and resources inputting, updating and exchanging data with customers and partners. What they don't realize is that by investing in a solution that can collect and distribute content for them, they can expand their network of partners, eliminate errors and enhance consumer trust.”
As Wilkinson points out, time is money -- and often it takes spending money to both save time and gain opportunities to make more money. Keeping this in mind, there are many resources available that are small business friendly without breaking the bank. Among them are Tranzlogic, InsightSquared and IBM’s Watson Analytics -- all of which offer valuable ways for small businesses to strengthen their overall business efforts through data. 
Finally, as you begin to introduce data into your own small business, consider the core reason that data is essential. It all directs back to your customers and target market.
“It's important to start with the customer in mind and find tools that give you insight into that customer's journey to purchase,” says Mark Sullivan, director of demand generation at CallRail.
When used correctly, data becomes a powerful tool for driving efficiency, accuracy and increased revenue for your business while also helping to better understand your customer, target market and competition.
From increasing customer loyalty to improving inventory management to strengthening B2B relationships, let data help you along the way in your unique small business journey.

Source: https://www.entrepreneur.com/article/290594

2 Comments

Big Data firms are not immune to disruption

3/24/2017

9 Comments

 
The digital market space is vast, open to potentially innovative and disruptive platforms. Yet, in this limitless market space, allegations of market abuse are being filed with competition authorities, including the Competition Commission of India. It is widely accepted that digital markets are subject to disruptive innovation, and that this limits the ability of large firms to exercise market power if they fail to innovate. Some have suggested that network effects and “lock in” serve to remove this competition tension from seemingly dominant firms. Proponents often invoke high-profile and data-rich digital firms such as Snapdeal, Flipkart, Jabong, Ola, Amazon, Google, Facebook and Uber as industry participants who need no longer compete because their databases are so vast. “Data immunity” is the new set of arguments for network lock-in effects.
Persistence of scepticism about the anti-competitive effects of networks is not surprising. An earlier attempt at highlighting the benefits networks bring to consumers found few takers, despite a well-reasoned minority order of the Commission (MCX-SX v NSE). Network effects in the currency derivative stock exchange, the order argued, create depth, enabling a variety of instruments of trade to the benefit of consumers. And at zero cost. The current debate fails to appreciate that networks and the use of data are not inherently negative: that data-rich companies often use these resources to tailor and improve their products and services perhaps once again needs reaffirmation.
Arguments on “Big Data” and lock-in effects have rather tended to obfuscate competition issues with concerns of privacy and data security. At a time when more and more digital firms design products and services for the benefit of consumers, arguments about network effects and Big Data echo the familiar arguments of the NSE case. This time around, what is more disconcerting is the expansion of the jurisdictional domain of the commission from competition to issues of data security, risk and financial liability.
At the outset, let us ask: Does large firms’ access to private data create an entry barrier, an advantage denied to new entrants? Data requirements in the digital era have spawned a separate market for data. As consumers, we are continuously parting with personal data, be it know-your-customer information, online shopping or using a credit or debit card at brick-and-mortar outlets, some of which surfaces in the market for data. Recent estimates suggest that personal data can be bought for as little as Rs15,000. Consumer surveys are another source of access to specific data. Data is often mined from online searches as well. Access to data, essentially, is never a constraint to good business.
The counter argument that Big Data-rich firms capture consumer preferences with the help of sophisticated algorithms and are free from tensions of competition is not borne out by empirical evidence. The ground reality in India does not suggest an advantage to large firms with the proliferation of platforms, especially on smartphones. Consumer preference is a combination of social and cultural factors and in India, at least, is captured by two features—convenience and cost. Algorithms that build in these nuances are more likely to succeed. Thus, the entry of several new platforms with innovative software that enables speech-to-text in major Indian languages or the use of Artificial Intelligence to make buying easy online shows that data provides no immunity for dominant firms.
At a recent conference of the commission on “Economics and Competition Law”, participants from outside the country were surprised at the number of alternatives available for almost every activity of daily life, including purchase of vegetables and groceries, both online and offline. Where products and services are available on an equivalent or almost equivalent basis without the provision of data, data-rich firms are unlikely to rest on their laurels.
And data does not necessarily lead to lock-in. Many argue that the “positive feedback loop” of network effects means that digital customers are unlikely to switch. But because data is not a finite resource, this is usually not so. Multihoming is a well-accepted and understood feature of tech markets: A look at how many social media sites there are today and how many of them young individuals use is sufficient proof. The behavioural instinct of consumers to stick with networks is often overstated. And it certainly doesn’t apply to the younger crowd.
Attempts to raise concerns on access to data and arguments about misuse of data being a competition issue deflect the focus of the commission from consumer benefits and competition. As in the MCX-SX case, it can distort markets. Undoubtedly, concerns on data security and the larger issue of liability and risk of payment platforms have to be addressed. But are these competition issues? Do these concerns fall in the realm of the Competition Commission?
Intervention is warranted only if a firm’s position has been achieved through anti-competitive means and exclusionary conduct that does not allow competitors to compete on merits of innovation, convenience and cost. It also goes without saying that firms should not be placed at a disadvantage if they have acquired a database through legitimate means by producing an innovative product or service that benefits consumers. More significantly, the jurisdiction of the commission is competition. Crossing over to other jurisdictions of privacy, liability and security would only distort the digital space for Indian start-ups. The simple rule, “let the consumer decide”, should be the governing principle here. Indian consumers are very discerning.

Source: http://www.livemint.com/Opinion/0CjXgKqlfYdWd1sHOBZYNJ/Big-Data-firms-are-not-immune-to-disruption.html

9 Comments

Big Data Analytics Firm RavenPack Launches New Platform

3/23/2017

0 Comments

 
RavenPack, a provider of big data analytics for financial services, announced on Tuesday the launch of a new platform that will bring big data analytics to fundamental and discretionary investors.
The new platform will consist of a self-service data and visualization platform that gives financial professionals the opportunity to analyze data for investing and trading as well as providing risk management and compliance support.
Users of the new platform will be able to monitor market-moving events and analyze these events through a wide variety of data sets, including stock prices, geopolitical events, news-flow, social media activity, payments data, weather apps, as well as data from the Internet of Things (IoT).
The objective of the new platform is to give investors "predictive insights and evaluate investment opportunities in real time."
In conjunction with the new product launch, RavenPack announced it has secured $5 million in backing from Draper Esprit, a venture capital firm that focuses on high-growth technology businesses.
Executive Commentary
"RavenPack has become a vital source of information for quantitative investors," said Yin Luo, vice chairman of Wolfe Research and Wall Street's top-ranked quantitative analyst (six years in a row by the Institutional Investor Equity Research Survey). "The new RavenPack platform bridges the gap between systematic and fundamental investment managers exploring market anomalies and looking for an edge from unstructured big data."
"Our new product and backing from Draper Esprit strengthens our ability to democratize analytics on big data in capital markets," said Armando Gonzalez, CEO of RavenPack. "The new platform opens up access to unstructured data analytics which until now have only been available to the most sophisticated quantitative investors and traders."
"RavenPack's core value lies in turning unstructured big data into real-time actionable insight to generate significant results. This is particularly relevant for banks, hedge funds, and asset managers who are fast becoming data hoarders. Their new platform will empower investors across the board to better understand volatile markets. We're excited to be investing in a fintech company with a brilliant track record and look forward to working with them as they become an important cornerstone in the big data ecosystem." Simon Cook, CEO Draper Esprit.

​Source: https://www.benzinga.com/fintech/17/03/9195829/big-data-analytics-firm-ravenpack-launches-new-platform




0 Comments

Big Data is more than merely analyzing it; it's what you do with the results

3/22/2017

1 Comment

 
For businesses looking to hire a data scientist, rather than looking for one individual with all the requisite skills, it is easier to look for several individuals who each possess some of the skills needed and who can be integrated into a team, said Oyvind Roti, International Head of Solutions, Google Cloud Platform.
Roti was speaking at Agoda Outside, a new undertaking from Agoda that focuses on ideas, research, exchanging information, philanthropy, and outreach. The event aimed to help attendees understand how big data is changing the way people travel and how companies work with big data, and how to use it better.
And it was the use of Big Data that panelists spoke about. As Yaron Zeidman, CTO of Agoda, put it, “Big Data is not only about the data, it’s about what you’re doing with it.”
Dr Andreas Weigend, Director at Social Data Lab, said that data didn’t have to be a cost to the company, but could instead be a profit and revenue generator.
Big data has been a key buzzword for businesses for some time now. While many people believe big data can be beneficial, few organizations know how to fully maximize its potential. It continues to evolve, becoming a driving force behind waves of digital transformation.
Dr Weigend said that successful companies succeed by removing barriers to information rather than raising them. After all, if you know all of the data, you can make better decisions.
At Agoda’s new Singapore office earlier today, speakers and panelists from A*STAR, Accenture Digital, Digital McKinsey Asia, Google, GovTech, and Grab shared in-depth principles about how their organizations manage big data, the layers of thought processes behind each analysis, and the essential ingredients in making big data work – including culture, people, skills, tools, and privacy.
“We understand the essence and importance of big data, and saw that there is a disparity in knowing and understanding it,” said Yaron Zeidman, CTO, Agoda. “In order to better make sense of the power of the information we have on hand, we first need to understand the impact it has on various industries.”
As for the use of the data, Georges Mao, Director of Marketing Science APAC at Facebook, said it was important for businesses to reconcile the use of customers’ data with the needs of the business. “Without trust there is no business,” he said. “The duty we have is a mutually beneficial result for company and users.” He cautioned against misuse, adding that if you keep delivering a message that is wrong, people will eventually tune you out.
“Agoda Outside has allowed us to use this platform to share our understanding of data in the transport industry, and to create solutions that will solve everyday problems. Knowledge sharing that will help consumers is something that Grab also supports, and we believe it is a key step in constant progress,” said Marian Panganiban, Regional Policy and Research Manager, Grab.

Source: http://www.networksasia.net/article/big-data-more-merely-analyzing-it-its-what-you-do-results.1490137485

1 Comment


LOCATIONS

New Brunswick, New Jersey
San Francisco, California
Online or on-site at your location

Contact us

Lisa Doehring
Program Manager
ldoering@cx.rutgers.edu
415 343 0264

BIGDATA@RUTGERS

The Big Data Certificate Program at Rutgers University is offered in connection with the Rutgers Center for Innovation Education and the Rutgers Professional Science Master's Degree.
