Archive for the Analytics Category

May 24 2016

Intelligent Traffic Management – Applying Analytics on Internet of Things

Extending the previous article, “Revolutionizing the Agriculture Industry”, this article looks at how the Internet of Things (IoT) and analytics are going to revolutionize traffic management in the modern world. Day by day, roads are deluged with more vehicles while the road infrastructure remains unchanged. Congestion in cities is cited as the major transportation problem around the globe. According to the Texas A&M Transportation Institute, in 2011 traffic congestion cost $121 billion in travel delay, with a loss of 2.9 billion gallons of fuel, in the USA alone.

The world is becoming more intelligent: sensors in cars and roads are connected to the internet, and the devices communicate data with each other. Using this intelligence, fleets can avoid accidents, predict car failures, take preventive maintenance actions, and so on. Let us say hello to John.

John drives to the office every day. His intelligent car receives data about on-the-road events, such as an accident that took place ahead on his route, and guides him to an alternative, more efficient route. With this intelligence John reaches his destination, and his car informs him about an available parking slot! Such events will not only help John save enormous amounts of time and fuel but also give him a stress-free life. Are any such services available already?

 

Meet Zenryoku Annai

Zenryoku Annai is a service provided in Japan by Nomura Research Institute (NRI). Using this service, subscribers all over Japan can plot the shortest travel routes, avoid traffic snarls and estimate what time they will arrive at their destinations. It combines information from satellite navigation systems linked to sensors at fixed locations along roads with traffic data derived from statistical analysis of position and speed information from subscribers, moving vehicles and even pedestrians. Data from thousands of taxicabs is added to the mix. Using all this information, Zenryoku Annai analyzes road conditions and helps drivers plan routes more accurately and over a wider range than is possible with conventional GPS systems. As more vehicles were added over time, NRI adopted in-memory computing technology, which improved search speed by a factor of more than 1,800 over its previous relational database management system: roughly 360 million data points can be processed in just one second.


Fig: 1.1

Fig: 1.1 In a conventional SatNav system, road conditions are known only at locations where sensors are installed; with NRI’s probe technology, road conditions can be determined much more accurately using position and speed data delivered from in-car units, mobile phones and sensors, and the system can also suggest the best alternative route for the user. Zenryoku Annai is not yet fully capable, since not all vehicles on the road are connected to it, but one can soon expect everything that runs on the road to be connected to the internet.
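The routing idea at the heart of such a service, picking the fastest route when edge travel times are continually refreshed from probe data, can be sketched as a shortest-path search over congestion-weighted travel times. A minimal illustration (the road graph and delay figures below are invented, not taken from NRI’s system):

```python
import heapq

def fastest_route(graph, source, dest):
    """Dijkstra over current travel times (minutes), the way a SatNav-style
    planner would, with edge weights already inflated by live congestion."""
    dist = {source: 0.0}
    prev = {}
    pq = [(0.0, source)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == dest:
            break
        if d > dist.get(node, float("inf")):
            continue
        for nxt, minutes in graph.get(node, {}).items():
            nd = d + minutes
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                prev[nxt] = node
                heapq.heappush(pq, (nd, nxt))
    # Walk the predecessor chain back from the destination.
    path, node = [dest], dest
    while node != source:
        node = prev[node]
        path.append(node)
    return list(reversed(path)), dist[dest]

# Hypothetical road graph: probe data has inflated the highway's travel
# time because of an accident ahead, so the backroad wins.
roads = {
    "home":     {"highway": 20, "backroad": 12},
    "highway":  {"office": 5},
    "backroad": {"office": 9},
}
route, minutes = fastest_route(roads, "home", "office")
print(route, minutes)  # ['home', 'backroad', 'office'] 21.0
```

In a real probe-based system the edge weights would be re-estimated every few minutes from in-car units and roadside sensors; the search itself stays the same.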

 

Driverless Cars: Another example!

A driverless/autonomous car is an ideal example where IoT and real-time analytics play a crucial part. As the car travels down a road, it interacts with signals and other vehicles through various sensor points, and it re-routes itself by weighing congestion and distance to reach the destination in minimum time. Pilot test runs of Google cars (fig: 1.2) showed that each car generates about 1 gigabyte of data per minute from its surroundings, and these cars depend entirely on IoT and analytics to make decisions.


Fig: 1.2

 

Conclusion

IoT with analytics is at a very nascent stage, and there are many barriers restricting its growth: security of data, privacy of individuals, implementation problems and technology fragmentation. The average American commuter spent 14 hours per year in traffic in 1982; by 2010 this had surged to 34 hours per year, and if the problem remains unsolved it may rise to 40 hours per year. With population density exploding and space in short supply, it is impractical to keep increasing road capacity; a more viable option is to harness the power of data analytics on IoT.

Disclaimer:

Views expressed on this article are based solely on publicly available information.  No representation or warranty, express or implied, is made as to the accuracy or completeness of any information contained herein.  Aaum expressly disclaims any and all liability based, in whole or in part, on such information, any errors therein or omissions therefrom.

 


May 12 2016

Big lessons from big data implementation – Part I

Each day 23 billion GB of data is generated, and the rate of data generation is doubling roughly every 40 months! Beyond their own business data, organizations now have humongous data available from Google, Facebook, Amazon, etc. They wish to use all this available data to find useful information for running their business better. Let us look into the big data deployments of a few organizations and learn from their experience.

Case 1: Rabobank

Rabobank is a Dutch multinational banking and financial services company headquartered in Utrecht, Netherlands. It is a global leader in food and agro financing and sustainability-oriented banking. Rabobank started developing a big data strategy in July 2011. They created a list of 67 possible big data use cases. These use cases included:

  • To signal and predict risks and prevent fraudulent actions against the bank;
  • To identify customer behavior and obtain a 360-degree customer profile;
  • To recognize the most influential customers as well as their networks;
  • To be able to analyze mortgages;
  • To identify the channel of choice for each customer.

For each of these use cases they roughly calculated the implementation time as well as the value proposition. In the end, Rabobank moved forward with big data applications to improve business processes, as these offered the best prospect of a positive ROI. A dedicated, highly skilled, multidisciplinary team was created to start on the big data use cases, using Hadoop to analyze the data. Social data, open data and trend data were integrated, so their data landscape was a deluge of semi-structured and unstructured data. Hadoop was only part of the big data strategy: the key to success was the multidisciplinary team, and the fact that they embraced uncertainty and accepted that mistakes would be made.

Problems faced during implementation

Rabobank didn’t store raw data, due to cost and capacity issues. Data quality was inconsistent and security concerns were high. Rabobank noticed that it was often unclear who owned the data or where it was all stored. Hadoop differs from older database and data warehousing systems, and those differences confused users.

Lessons

  1. Specialized knowledge as well as visualization is very important to drive big data success.
  2. Start with the basics and don’t stop at stage one: big data implementation is a continuous journey to reap data-driven insights.
  3. Not having the right skills for the job can be a big problem.
  4. Don’t underestimate the complexity of a big data system implementation; focus on data management.

Case 2: OBAMA CARE

In 2010 the newly elected president of the United States introduced the Patient Protection and Affordable Care Act. The main purpose of this act was to bring the best of public and private insurance coverage to the population, thereby controlling and reducing healthcare costs, and it requires citizens to interact with the government via a website to do so. The system is in essence a big data implementation problem, with data being collected on a potential population in excess of 300 million people across the entire country. Unfortunately the project has not progressed as planned, and has become mired in technological controversy.

Problems faced during implementation

  • The political standoff over the act brought the country close to defaulting on its debt
  • Cost of Obamacare: $1.6 trillion
  • Estimated cost, 2014-2024: $3.8 trillion

How the problems could have been anticipated:

  • Specialized knowledge as well as visualization could have prevented the loss.

Lessons

  1. Don’t underestimate the complexity of a big data system implementation; focus on data management.
  2. Prior analysis and prediction of data complexity can prevent failures.
  3. Most of the data collected and stored in an agency’s transaction processing systems lacks adequate integrity, so make sure that captured data meets integrity standards.
  4. Specialized knowledge as well as visualization is very important to drive big data success.
  5. Not having the right skills for the job can be a big problem.

 

We shall analyze a few more cases tomorrow. Keep watching this space.

Disclaimer:

Views expressed on this article are based solely on publicly available information.  No representation or warranty, express or implied, is made as to the accuracy or completeness of any information contained herein.  Aaum expressly disclaims any and all liability based, in whole or in part, on such information, any errors therein or omissions therefrom.

References:

  1. The Process of Big Data Solution Adoption, by Bas Verheji.
  2. Case study on Rabobank, by Hewlett Packard Enterprise.
  3. Big Data for All: Privacy and User Control in the Age of Analytics, by Omer Tene and Jules Polonetsky.
  4. Realizing the Promise of Big Data: Implementing Big Data Projects, by Kevin C. Desouza.
  5. Case study on the Obamacare big data problem (Patient Protection and Affordable Care Act).
  6. http://www.mckinsey.com/business-functions/business-technology/our-insights/big-data-whats-your-plan
  7. http://www.businessofgovernment.org/blog/business-government/implementing-big-data-projects-lessons-learned-and-recommendations
  8. https://hbr.org/2012/11/the-analytics-lesson-from-the
  9. http://dataconomy.com/hadoop-open-source-software-pros-cons/

 

May 11 2016

Revolutionizing the Agriculture industry – Applying Analytics on Internet of Things

Analytics on Internet of Things

The Internet of Things (IoT) and big data analytics may be the two most buzzed-about terms in industry over the past two years. IDC has forecast that IoT will yield $8.9 trillion in revenue by 2020, and Goldman Sachs has estimated that 28 billion devices will be connected to the internet by 2020. Each of these connected devices will send back humongous amounts of data every second, so a proper analytical tool or solution is needed for value creation.

Introduction to IoT:

IoT is a network of inter-connected objects able to collect and exchange data. This eco-system enables entities to connect to, and control, their devices: a device carries out a command and/or sends information back over the network to be analyzed and displayed remotely. For example, at 6 am John’s phone receives an email informing him that his meeting has been pushed back; his mail service then tells his smart clock to give him an extra 30 minutes of sleep and alerts him to the change once he wakes. This is how the whole system works, and adding analytics to this setup will enable firms to make efficient, well-informed decisions with the available data.

Analytics on IoT in Agricultural Industry:

How will this trend influence the agriculture industry? The answer: it is going to entirely revolutionize the industry’s working patterns rather than just influence them. By 2050 the global population is expected to reach 9 billion people (34% higher than today), so food production must increase by at least 70%; on the other hand, the U.S. Department of Agriculture states that 90% of all crop losses are due to weather-related incidents. We must therefore minimize weather-related losses and make careful use of the limited fresh water resource, as drought is prevailing across the globe (in fact, 70% of the world’s fresh water is already used for agriculture). These issues can be addressed with predictive analytics, which will play a major part in value creation. Predictive analytics is central to forecasting future conditions, and it requires a lot of input data from many distinct variables. The basic idea is to identify and differentiate high- and low-yielding crop lands by measuring their productivity.
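As a toy illustration of that idea, a minimal sketch might fit a simple linear trend of yield against one weather variable and flag plots that fall well below the trend as low-yielding candidates. The plot records, rainfall figures and the 0.5 t/ha tolerance below are entirely invented for illustration:

```python
# Hypothetical per-plot records: (rainfall_mm, yield_tonnes_per_ha)
plots = {
    "plot_a": (520, 3.9),
    "plot_b": (610, 4.6),
    "plot_c": (480, 2.1),   # underperforms for its rainfall
    "plot_d": (700, 5.2),
}

# Ordinary least squares by hand: yield = a + b * rainfall (approximately)
n = len(plots)
xs = [r for r, _ in plots.values()]
ys = [y for _, y in plots.values()]
mean_x, mean_y = sum(xs) / n, sum(ys) / n
b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
    sum((x - mean_x) ** 2 for x in xs)
a = mean_y - b * mean_x

# Plots whose actual yield falls more than 0.5 t/ha below the fitted
# trend are flagged for intervention (irrigation, pest control, etc.).
low_yield = [name for name, (r, y) in plots.items()
             if y < a + b * r - 0.5]
print(low_yield)  # ['plot_c']
```

A real system would use many more variables (soil moisture, temperature, nitrogen) and a proper model, but the differentiation of high- and low-yielding land follows the same pattern.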

Flint River Valley project:

Hyper-local forecasting techniques will help farmers overcome the obstacles above. The Flint River Valley is part of Georgia’s agricultural industry, contributing roughly $2 billion annually in farm-based revenue. A pilot test run has been made in the Flint River Valley in the USA by researchers from the Flint River Soil and Water Conservation District, the U.S. Department of Agriculture, the University of Georgia and IBM. The primary objective is to give farmers beneficial information about the weather by analyzing data obtained from various sensors (fig: 1.1) installed over the fields. The sensors collect data such as temperature and moisture levels in the air and soil, which is blended with satellite data. This data feeds Variable Rate Irrigation technology, which helps farmers conserve water by turning sprinklers off over areas that don’t need water and back on over areas that do.
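The Variable Rate Irrigation logic described above is, at its core, a per-zone threshold rule over sensor readings. A minimal sketch (the zone names, moisture readings and the 25% threshold are invented for illustration, not values from the Flint River pilot):

```python
# Hypothetical soil-moisture readings per irrigation zone (% volumetric),
# as reported by in-field sensors.
zone_moisture = {"zone1": 34.0, "zone2": 18.5, "zone3": 22.0, "zone4": 41.2}

# Below this level a zone is considered dry enough to water (assumed value).
MOISTURE_THRESHOLD = 25.0

def sprinkler_plan(readings, threshold=MOISTURE_THRESHOLD):
    """Turn sprinklers on only over zones whose sensors report dry soil."""
    return {zone: ("ON" if moisture < threshold else "OFF")
            for zone, moisture in readings.items()}

plan = sprinkler_plan(zone_moisture)
print(plan)  # {'zone1': 'OFF', 'zone2': 'ON', 'zone3': 'ON', 'zone4': 'OFF'}
```

In practice the threshold would itself be adjusted by the blended sensor-plus-satellite forecast (no point watering ahead of rain), but the on/off decision per zone is what saves the water.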

Fig: 1.1


Fig: 1.2


Fig: 1.2 shows cloud water density, that is, the water content in a cloud, which is important in determining which type of cloud is likely to form; this is extremely useful for weather forecasting.

According to IBM, farmers will be able to track weather conditions in 10-minute increments up to 72 hours in advance. A full 72-hour forecast generates around 320 gigabytes of data, while each individual farmer requires only a small, personalized tranche of it. IBM is also building a weather model with 1.5-kilometer resolution for the farmers. The system is estimated to save 15% of the total water used in irrigation, amounting to millions of gallons per year, at a cost of around $20-$40 per acre for the first 3 years.

 Fig: 1.3


With geospatial mapping, sensors and predictive analytics, farmers will be presented with real-time data as time series and graphs at a granular level: soil quality, field workability, details on nitrogen, pests and disease, precipitation, temperatures and harvest projections, even predicting expected revenue in relation to the commodity’s market trend, all analyzed and reported via a smartphone (Fig: 1.3), tablet or desktop. In future it may even become mandatory to use IoT and analytics in the agriculture industry in order to sustain growth and maximize revenue many-fold with minimal use of resources.

Conclusion:

The IoT is on its way to becoming the next technological revolution with $6 trillion to be invested before 2020, and a predicted ROI of $13 trillion by 2025 (cumulative of 2020-2025). Given the massive amount of revenue and data that the IoT will generate, its impact will be felt across the entire big data universe, forcing companies to upgrade current tools and processes, and technology to evolve to accommodate this additional data volume and take advantage of the insights.


May 10 2016

Dawn Of Online Aggregators – How Business Analytics enabled them

Online aggregators are websites in the e-commerce industry that collect information about various goods and services from several competing sources and present it in one place. The aggregator model helps consumers with customized, tailored offerings that cater to their needs and wants, and adds value by using their feedback to improve their shopping experience.

Who are the Online Aggregators?

They have been graciously welcomed by both end players, customers and businesses, since they enhance sales and extend the reach of products and services to customers, benefiting both sides. Aggregators now span many industries such as travel, payment gateways, insurance and taxi services; some firms have opened secondary markets over the internet, like letgo, Locanto and vinted, and food-ordering services like Campus food, Gimmegrub, Diningin, GetQuick et al.

Business Analytics – As a key factor for the aggregator’s triumph

To bring about such a dynamic yet strong change in the e-commerce industry, away from what had traditionally been followed, what approach did the aggregator firms take? What served as the base for these firms to venture into this space? It became possible by unlocking marketing data and turning it inside out. This job is well handled by business analytics, which can be described as the intersection of data science with business.

Business analytics is the study of data through statistical and operational analysis. It focuses on drawing new insights from the data collected and utilizing them to enhance business performance. It is closely related to management science, since it makes extensive use of fact-based explanatory and predictive models with statistical analysis to drive management decision making.

Uber- A study on how Business analytics has augmented their business 

In an online aggregator like Uber, which has spread across the world, business analytics plays a crucial role. Uber is an app-based technology platform that links passengers who want to hire a taxi with drivers who are ready to accept a ride; Uber takes 20% of the cab fare as its commission and the remainder goes to the driver. The firm has rooted its trial across 444 cities worldwide. In a city like New York there are around 14,000 Uber cabs, while the unorganized sector holds 13,500 taxis. In Los Angeles, 20,000 of 22,300 cabs are registered with Uber.


Fig 1: Market share of Uber

 

Uber maintains a huge dataset of all its drivers, its users and the details of every city in which it operates, so that it can instantly match a passenger with a nearby driver. In the USA, traditional taxi meters charge passengers based on the duration of the ride, but Uber follows a patented algorithm that uses both the distance and the duration of the trip. The pricing technique Uber uses is called surge pricing: the fare is multiplied during surge times, when traffic and demand overflow, for which the firm had to study traffic in New York City (Fig 2). Passengers are warned in advance that the rate is a multiple of the normal one. The algorithm also advises drivers to stay home when there is a shortfall of rides, or encourages them to get behind the wheel to earn extra money while the city is busy.


Fig 2: The analysed traffic hours in New York City, and the surge-price notification in the Uber app informing the passenger of the multiplied charge.
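The surge mechanic described above, a distance-and-duration fare multiplied by a demand-driven factor, can be sketched in a few lines. The base rates, the demand/supply ratio rule and the 3.0x cap below are invented for illustration; Uber’s actual algorithm is patented and proprietary:

```python
def surge_multiplier(ride_requests, available_drivers):
    """A simple demand/supply ratio, floored at 1.0 and capped at 3.0,
    as a stand-in for the real surge logic."""
    if available_drivers == 0:
        return 3.0  # apply the cap when there is no supply at all
    ratio = ride_requests / available_drivers
    return min(3.0, max(1.0, round(ratio, 1)))

def fare(distance_km, duration_min, requests, drivers,
         base=2.0, per_km=1.2, per_min=0.3):
    """The fare uses both distance and duration, then applies surge."""
    normal = base + per_km * distance_km + per_min * duration_min
    return round(normal * surge_multiplier(requests, drivers), 2)

# Off-peak: demand roughly matches supply, so no surge applies.
print(fare(10, 25, requests=40, drivers=50))   # 21.5
# Rush hour: demand far exceeds supply, so the fare is multiplied.
print(fare(10, 25, requests=150, drivers=50))  # 64.5
```

The same ratio is what lets the app nudge drivers: a ratio below 1 signals a shortfall of rides, while a high ratio signals extra money to be made behind the wheel.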

 

In a study, Uber found that many New York passengers travel from almost the same locality to almost the same destination. When surveyed, the majority agreed to share their cab even with a stranger. Uber Pool is a service provided by Uber in which a passenger is matched with another passenger who is waiting to board along the way and is headed to almost the same destination. This lets a passenger share the cab with a stranger and cuts down the cost.


  Fig 3: Operation of Uber pool
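The matching idea behind such a pooling service can be sketched as pairing riders whose pickup points and destinations both fall within a small radius of each other. The coordinates, names and the 1 km threshold below are invented for illustration; real ride-matching also weighs detour time, seats and direction of travel:

```python
import math

def km_between(a, b):
    """Great-circle (haversine) distance between two (lat, lon) points, in km."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 6371 * 2 * math.asin(math.sqrt(h))

def poolable(r1, r2, max_km=1.0):
    """Two ride requests can share a cab if both their pickups and their
    dropoffs lie within max_km of each other."""
    return (km_between(r1["pickup"], r2["pickup"]) <= max_km
            and km_between(r1["dropoff"], r2["dropoff"]) <= max_km)

# Hypothetical requests around Manhattan coordinates.
alice = {"pickup": (40.758, -73.985), "dropoff": (40.706, -74.009)}
bob   = {"pickup": (40.761, -73.982), "dropoff": (40.703, -74.013)}
carol = {"pickup": (40.850, -73.880), "dropoff": (40.706, -74.009)}

print(poolable(alice, bob))    # True: nearby pickups and dropoffs
print(poolable(alice, carol))  # False: pickups are kilometers apart
```

Runs of such pairwise checks over the live request stream are what make the shared fare possible.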

Uber lets the passenger rate the driver at the end of every trip, based on the driver’s knowledge of the city roads, professionalism, driving ability, car quality and punctuality. This evaluation is used to educate drivers on the skills they need, or even, at the extreme, to remove them from the service.


Fig 4: Distribution of drivers by rating and self-assessment chart to the drivers from Uber app.
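That rating-based evaluation amounts to averaging per-trip scores and mapping the average to an action. A minimal sketch (the 4.6 and 4.0 cutoffs are invented for illustration; the article does not state Uber’s actual thresholds):

```python
def evaluate_driver(ratings, warn_below=4.6, deactivate_below=4.0):
    """Average the 1-5 star trip ratings and map them to an action."""
    avg = sum(ratings) / len(ratings)
    if avg < deactivate_below:
        return avg, "remove from service"
    if avg < warn_below:
        return avg, "offer training"
    return avg, "in good standing"

print(evaluate_driver([5, 5, 4, 5, 5]))  # (4.8, 'in good standing')
print(evaluate_driver([4, 4, 5, 4, 3]))  # (4.0, 'offer training')
print(evaluate_driver([3, 4, 3, 3, 2]))  # (3.0, 'remove from service')
```

A production system would use a rolling window of recent trips rather than a lifetime average, so that improvement after training is visible quickly.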

 

Conclusion

In Uber’s journey as an online aggregator, business analytics has come in handy: to benchmark the firm among market players; to understand customer needs and driver attitudes; to devise new strategies like Uber Pool, letting passengers share a taxi and its fare; to run the rating system that informs managerial decisions in favour of or against drivers; and to implement a contemporary pricing method, surge pricing, which charges passengers based on changes in demand.

A firm should turn to business analytics when making managerial decisions, planning a strategic move, or launching a new product or service, because it replaces assumptions with firm statistical evidence on various business questions. It helps management make decisions faster and improves critical performance with precise data in hand. Business analytics helps a business acquire and retain customers and reduce its churn rate. The analysis provides more insight into the market: finding target customers, evaluating the impact of changes in price or service on them, and understanding their expectations. Business analytics can thus be the brain of an organisation, enabling proactive decisions and planning the business for maximum success by looking into the future.

Reference:

(The data and information used in this article are utilized from the referred sites and documents, and are not self-generated.)

https://www.linkedin.com/pulse/amazing-ways-uber-using-big-data-analytics-bernard-marr

https://georgianpartners.com/data-science-disruptors-uber-uses-applied-analytics-competitive-advantage/

http://www.economicpolicyjournal.com/2015/08/how-big-is-uber.html

http://simplified-analytics.blogspot.in/2011/04/why-business-analytics-is-important-for.html

http://www.computerweekly.com/feature/Business-analytics-make-for-smarter-decisions


Sep 19 2015

Building Data Science in your organization – Is it really important?

Does data science bring value to your organization? Hang on, what is this data science really? Why should we really care? You should, because it might change the way you run your business in the near future, whether you like it or not, just as software/IT changed the world a few decades ago! IT is omnipresent, and those who didn’t care to change course suddenly had to pay a steep price to do so. Today, thankfully, the cost of IT adoption is very low, since the industry is mature enough to provide the right solutions at a low price. But the initial period was very crucial. People were extremely cautious about implementation, evaluated ROI, and questioned why they should invest, considering the new systems, servers, recruitment, etc. Those were times when a clear ROI from IT could not be calculated. IT was nascent. There were gross blunders like the Y2K issue. But the world changed, and IT has touched almost all facets of life.

Whether we like it or not, data science is going to bring another transformation to day-to-day activities. There is a wide perception that data science is applicable only to bigger organizations and not to small and medium businesses. Of course, there are successful early adopters, and a few have burnt their fingers with analytics adoption. Moneyball showcases how Billy Beane used data science effectively to build a competitive baseball team; of course, the same strategy has drawn criticism in other games. Success requires careful adaptation to the business context.

Analytics adoption is maturing. It is time for companies to realize quick data-driven insights if they want to achieve a competitive edge over other companies. How should one adopt an analytics strategy? Come and attend our analytics event “Analytics for CXO”, a must-not-miss program for CXOs who want to adopt or implement analytics in their organization.

  1. Update yourself on the bleeding-edge analytics developments happening in the industry
  2. Customized analytics roadmap specific to your organization/department
  3. Consultation with industry experts (subject to prior appointment)
  4. Practical use cases with analytics implementation, benefits and value
  5. Focused group discussions on the benefits, challenges and issues faced by organizations
  6. Learn best practices from the industry, academia and peers
  7. Analytics Jumpstart Kit, for small data or big data: a must-have for your organization

Registration @ Explara -  https://in.explara.com/e/analytics-for-cxos
Date: Saturday, November 21, 2015, 9:00 AM to 5:00 PM
Venue: IIT Madras Research Park, Chennai

Mar 14 2015

DATA SCIENCE CONCLAVE 2015


The 2015 Data Science Conclave took place on February 20-21 at Hotel Rain Tree, Chennai, and featured keynotes, panel discussions, breakout sessions, lightning talks and more. A special thanks to our sponsors Target India and Contact Singapore for making this event a grand success!

Agenda for enabling the Data Science Conclave…

Day-1

The event opened with a featured keynote address by Mr. Rajesh Kumar, Founder & Managing Director of AAUM Research & Analytics Pvt Ltd. Rajesh emphasized the need for data science practices in organizations and how insights from data science are transforming the data-driven world.

Taking it forward, Ms. Parvathy Sarath, Director and data evangelist at AAUM, with extensive experience in finance, retail, social media, human resources and government, gave a featured presentation on Data Science: learn, develop and deploy, together with Lead Data Scientist Mrs. Praveena Sri. They explained the importance of R, understanding data through R, data visualization and predictive analysis, and how beneficial these can be for an organization in the long run.

In the second session of Data Science: learn, develop and deploy analytics in your organization, Mr. Rajesh Kumar and Ms. Parvathy Sarath dealt with topics such as logistic analysis, multivariate analysis and decision trees using R.
Mr. Bala Chandran, a Hadoop developer with AAUM, presented on Big Data Analytics: the explosion of big data and the tools and techniques involved in handling it.

Mr. Elayaraja and Mr. Sankar Sundaram of Mobius Knowledge Services spoke on integrating analytics with big data. Their agenda included web evolution, web pattern matching, NoSQL and probabilistic models.

Mr. Bala Chandran from AAUM Research and Analytics gave a featured presentation on Cloud for Data Science, covering topics such as computing platforms, cloud services and cloud deployment tools.
These sessions brought together selected experts from the corporate world, who took the opportunity to present their knowledge and provide a better understanding of the specific challenges and opportunities for data science across sectors of society and the economy.

The last session, Do It Yourself, built on the conversations and work of the previous sessions, helped participants test their understanding of data science, and ensured inclusion and broad participation so that everyone benefited from the conference.

Day-2     

In a similar fashion, the second day’s events moved away from broadcast formats that treated everybody the same, and evolved towards discussions that allowed individual participants to learn what they needed to learn, as well as connect with peers and peer organizations of real value to them.

The day started with Mr. Naveen Gainedi, Senior Group Manager of Analytics and Reporting at Target India, presenting the data science practices of his organization. The presentation facilitated a focused discussion on the elements of a data science practice and on building data science teams.

To look at the other side, how a startup company builds data science practices, Mr. Velumurugan, head of the Big Data Analytics practice at Altimetrik, presented the resources, challenges, processes and frameworks involved in establishing data science practices.

Panel Discussion 1, on technology spend, was headed by Ms. Bharathi Muthu, General Manager of IBM Software Market Management for South Asia. She did an exceptional job of bringing the delegates into the panel discussion and kept them engaged throughout. The other panelists were delegates from top companies: Mr. Nitin Chaudary, head of the Products and Technology function at Samunati; Mr. S. M. Bala Subramaniyan, mentor and strategic advisor; and Mr. Satya, manager at HP Analytics.


Panel Discussion 2, on predictive analysis, was headed by Mr. Naveen Gainedi of Target India. He moderated the discussion along with three other panelists, Mr. VRK Rao from CTS, Mr. Karthik Karunakaran from Mobius and Mr. Velmurugan from Altimetrik, and showed how to engage the participants and run a captivating panel.


Panel Discussion 3, on machine learning, was headed by Ms. Madhumitha from Wikimedia. She and the panelists, Prof. Ronojoy Adhikari from the Institute of Mathematical Sciences and Mr. Dorai Thodla, chief mentor at Build Skills, made the participants active contributors to the discussion.


The AAUM, Target and Mobius teams presented real-time case studies to create a classroom environment for group analysis and discussion, while simulations immersed participants in experiential situations.

To summarize the entire two-day event, an insightful interactive session was conducted by Mr. Rajesh Kumar and Mr. Bala Subramanian to improve learning, interaction and engagement among the participants.

Finally, it was time to close the conference! Mr. Sridharan, Vice President of AAUM Research and Analytics, expressed a deep sense of appreciation to all the speakers and participants.

It was indeed a great event that gathered like minds in one place with the common goal of learning, developing and deploying data science in the organization.


The event fittingly focused on evolving data science as a trendsetter for future analytics and on changing the entire way of looking at business excellence in a more insightful way.

With more and more events to come, it definitely paves the way for a new era of analytics for business: an insightful business!


Aug 8 2014

Walmart B2B online – An opportunity for Indian Retail?

Customers shop at a Best Price Modern Wholesale store, a joint venture of Wal-Mart Stores Inc and Bharti Enterprises, at Zirakpur.

Is organized retail killing the kirana shops in India? Things were pretty different a few years ago. Many people declared the end of days for kirana shops, but no such major change occurred; instead there were a few very interesting developments. Organized retail outlets did take off in a big way, yet they did not kill the mom-and-pop shops. In fact, everybody grew in the booming Indian economy. Long story short, mom-and-pop shops not only survived, they scaled their operations and expanded to more locations! Enter Walmart. What people said about Walmart’s business in India a few years ago was totally different from Walmart’s B2B operations. Walmart’s B2B business exclusively targeted small/medium businesses. Walmart B2B gave kirana store owners access to “Best Price Modern Wholesale” stores and provided benefits such as:

  1. There are an estimated 12 million kirana stores in India, of which as much as 90% are not directly serviced by India's FMCG majors. Best Price stores offer them access to quality products at the lowest prices, when they need them.
  2. The assortment, service and store layout of Best Price stores are customized to their specific needs, helping them get the benefit of high-quality products at the best prices to enhance their business profitability.
  3. Best Price stores help kirana stores manage their inventory better by enabling them to purchase in the quantities they need, at the time they require. They can hence use a Best Price store as their own godown, freeing up their capital for the business rather than locking it up in inventory.
  4. Best Price stores have also created an innovative kirana model, "My Kirana", tailored to provide kirana stores with training and insights in areas such as assortment planning, hygiene, in-store displays, inventory management and value-added services.
  5. In addition, education programs for members, with customized modules on taxation, food preparation, food safety and category workshops, have been introduced for different target segments.

Walmart has indeed adopted an approach that benefits small and medium businesses while benefiting from their growth in turn. This gives kirana shops a very good opportunity to procure quality items at the times that suit them. Walmart is now extending this model to an e-retail format.

This also creates a huge opportunity for these local shops to scale to the next level if they adopt innovative strategies. Data-driven insights will definitely come to their rescue: they need to protect their loyal customers against the competition to sustain and grow in this market. Those who embrace analytics quickly will stand out and emerge as winners in the neighborhood. These Davids can definitely beat the Goliaths, and data-driven analytics strategies will help!

Aaum’s geniSIGHTS solution has been helping retail customers quickly embrace analytics and power their day-to-day business operations with intelligent decisions. Learn more about Aaum’s retail/eTail offerings at http://genisights.com/retail/ and http://genisights.com/ecommerce

Reach out to the author by mailing your queries/suggestions to info@aaumanalytics.com

Jul 22 2014

Analytics for eTail

We are glad to share that our team successfully organized the second meetup, titled “Analytics for eTail”, at IIT Madras Research Park. The meetup serves as a platform for business firms to understand the relevance of analytics and how it can improve their day-to-day business operations. The meetup group is at http://www.meetup.com/Analytics-for-Business/about/ .

Ms. Parvathy Sarath, Director of Aaum Research and Analytics, introduced various eTail topics to the business firms. The topics discussed were:

  • Monitoring campaign performance
  • Price sensitivity analysis
  • Optimization techniques
  • Social media analysis
  • A/B testing
  • Recommendation
  • Market mix modelling
  • Sales attribution analysis
  • Heat map generation techniques
  • Loyalty measurement and analysis
  • Dynamic pricing


The meetup spanned two hours. The participants showed a lot of interest in understanding the concepts and in seeing how those techniques could be adopted for their businesses. Click here to view the presentation delivered on our geniSIGHTS solution. The participants also showed keen interest in attending future meetups conducted by Aaum.

 

Jul 16 2014

The vital click!

Attribution modelling is a widely used tool that helps marketers understand the impact that various marketing channels have on the ROI of their businesses. The insights from this simple exercise allow marketers to track and analyze the multiple touchpoints in a sale and their impact on the conversion value. This in turn helps marketers allocate their marketing budget effectively across the various channels.

With the enormous amount of data now available, digital marketers are shifting their focus to attribution in order to increase their conversion rates. Low online conversion is generally addressed through optimized customer service: personalizing customer experience management, using web analytics and making better use of feedback. Conversion path analysis is done to convert a normal website visitor into a paying customer; even a subscription to a newsletter may be considered a conversion. A simple layout of a conversion path is shown below:

[Figure: a simple conversion path]

When a website visitor is on the landing page, his user experience should be enhanced by providing relevant information about the products and by establishing an emotional connection to the brand. To avoid distraction, it is important to provide focused content, targeted conversation and highly flexible pages that users can operate easily.

Conversions generally involve more than one channel and travel down a multi-channel funnel.

The channels could be:

  1. Paid-search

  2. Organic-search

  3. Social media

  4. Referrals

  5. E-mail

  6. Direct

For example, one may first read about your product in a blog post. Then he may see a display ad. Later, he may read a review on some website. Curious about your product, he clicks your PPC ad. After that, he visits a product comparison website. Finally, he clicks on an organic search result, which reaffirms his decision to buy, and he purchases your product.

In this example, we see that a ‘buyer persona’ passes through various channels before finally deciding whether to purchase the product. A marketer would now want to assign credit to the different channels that assisted, directly or indirectly, in the conversion. The set of rules governing the assignment of credit to the various channels is known as attribution. Various attribution models are available for this process:

  1. Offline-Online attribution model: This model determines the impact that digital marketing channels have on offline marketing channels and vice versa, and how credit is to be assigned between them.

  2. Multi-Device attribution model: This model determines the impact that various devices (Laptops, Desktop, Mobiles, Tablets, etc.) have on conversions and how credit is to be assigned for various devices.

  3. Multi-Channel attribution model: This is the most common model used in the industry. It studies the effect that the various digital marketing channels have on one another in driving conversions, and how credit is to be assigned across those channels.
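As a minimal sketch, the rule-based credit assignment behind the simplest multi-channel models (last-click, first-click and linear) can be written as follows. The conversion path and channel names are hypothetical, mirroring the buyer journey described above:

```python
# Rule-based multi-channel attribution: split a conversion's value
# across the channels in its path under a chosen model.
from collections import defaultdict

def attribute(path, conversion_value, model="linear"):
    """Return a dict mapping each channel to its share of the conversion value."""
    credit = defaultdict(float)
    if model == "last_click":
        credit[path[-1]] += conversion_value   # all credit to the final touchpoint
    elif model == "first_click":
        credit[path[0]] += conversion_value    # all credit to the first touchpoint
    elif model == "linear":
        share = conversion_value / len(path)   # equal credit to every touchpoint
        for channel in path:
            credit[channel] += share
    return dict(credit)

# Hypothetical conversion path from the example above
path = ["blog", "display_ad", "review_site", "paid_search", "organic_search"]
print(attribute(path, 100.0, model="last_click"))  # all credit to organic_search
print(attribute(path, 100.0, model="linear"))      # 20.0 to each channel
```

Real attribution platforms add decayed and position-based weighting on top of these rules, but the core idea is the same: a weighting function over the touchpoints in each path.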

In the case study available here, we have tracked and analyzed the various metrics driving multi-channel attribution modelling as commonly used in the e-commerce market. The model extends further to showcase a data-driven hybrid approach that combines the intelligence of more than one attribution model, based on the customer profile, to channel the assigned credit effectively. The outcome of such an exercise would be similar to the plot below, which depicts the conversion value attributed to each channel used in your conversion path.

[Figure: conversion value attributed to each channel under the hybrid model]

The assignment of credit is essential to understanding which channels play the more important roles in assisting conversions. We can then determine which channels deserve more money and time, and thus improve the efficacy of the conversion rate at minimal cost. Attribution modelling is therefore becoming an integral part of the digital marketer's life.

Jun 25 2014

Marketing Mix Modelling from Multiplicative Models

E-commerce, or digital marketing, has emerged as a separate and rapidly growing business domain, with many businesses thriving solely on it. With the internet acting as a strong channel for doing business, there is a simultaneous need to use this channel effectively to add value to your business.

 

2014 – The year of Digital Marketing Analytics

According to a popular article in Forbes magazine, 2014 is the year of digital marketing analytics.

Businesses that use digital marketing analytics for improving customer acquisition, increasing brand loyalty, increasing ROI from their marketing channels, and so on have a competitive advantage over businesses that have not yet ventured into this space. The internet search giant Google Inc. has also created a massive impact in the field of e-commerce analytics through its Google Analytics platform, which provides a variety of KPIs and statistics to help digital businesses track and measure their online sales and marketing. AdWords, e-commerce reporting and real-time analytics are some of the popular services provided on this platform.

 

Digital Media Analytics

While these services are good enough to track high-level trends and provide basic directional metrics, there is often a need to dig much deeper into past data, with a sound understanding and usage of advanced analytical techniques, to get better insights. One such area in the field of digital marketing analytics that relies heavily on econometric models for decision making is Market Mix Modelling. The Gartner IT glossary defines Market Mix Modelling as “analytical solutions that help marketers to understand and simulate the effect of advertising (volume decomposition), and to optimize tactics and the delivery medium”.

In simple terms, Market Mix Modelling refers to the estimation of statistical models to measure and analyze how effectively various marketing channels contribute to your sales. It is often used to find the optimal mix of marketing channels and to forecast a future mix that would maximise the ROI from the various channels. In particular, it helps the business user address key questions such as:

  1. How do we decompose our sales to key drivers to understand the combination of these channels which would contribute to sales, market share or profits?
  2. How are these key drivers impacting sales over time?
  3. How do we calculate the efficiency or ROI from these market channels?
  4. What happens when we change the budget by channel?

 

Market Mix Modelling – Methodology and Illustration

Market mix modelling relies on three popular model forms to understand the impact of the various market mix variables on sales. Users choose one of the three based on what they require from the analysis.

 

Table 1. Summary of Market Mix Models functional forms

[Table image: functional forms of the additive, semi-logarithmic and logarithmic models]

Additive models tell us how much change is generated in the response variable, in unit terms, for a one-unit increase in the explanatory variable, and are only used in scenarios where the impact of each additional unit of the explanatory variable is identical. For example, this model is not suitable for decomposing the sales of seasonal brands such as yogurt or ice cream. However, additive models are easy to estimate, and sales decomposition can be implemented directly to understand the variable contributions.

Semi-logarithmic models, on the other hand, show how much the absolute level of the response increases or decreases proportionately with the independent variable. The fully logarithmic model goes even further, giving us the variable elasticities, which measure exactly the responsiveness of the dependent variable to each independent variable. This model comes in very handy for understanding how price and promotion-discount elasticities impact sales, and is hence the most preferred model by industry standards.

However, modelling and decomposing logarithmic or semi-logarithmic models, which are in multiplicative form, is indirect and quite challenging. A good understanding of the data and the techniques is essential to model market mix variables in multiplicative form and to decompose the result into something similar to the figure below.
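As a minimal sketch of the multiplicative (log-log) form, the model can be estimated by ordinary least squares in log space, where the fitted coefficients are directly the elasticities. The data below is synthetic, with assumed "true" elasticities of -1.5 for price and +0.3 for TV spend:

```python
# Estimating a multiplicative market mix model by OLS on logs:
#   log(sales) = b0 + b_price*log(price) + b_tv*log(tv_spend) + error
# The slope coefficients are the price and TV elasticities.
import numpy as np

rng = np.random.default_rng(0)
n = 120  # e.g. 120 weeks of simulated data
price = rng.uniform(2.0, 4.0, n)
tv = rng.uniform(10.0, 100.0, n)
# Assumed true elasticities for the simulation: price -1.5, TV +0.3
sales = 500 * price**-1.5 * tv**0.3 * np.exp(rng.normal(0, 0.05, n))

# OLS in log space
X = np.column_stack([np.ones(n), np.log(price), np.log(tv)])
y = np.log(sales)
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, e_price, e_tv = coef
print(f"price elasticity ~ {e_price:.2f}, TV elasticity ~ {e_tv:.2f}")
```

An elasticity of -1.5 means a 1% price increase reduces sales by about 1.5%, which is exactly the kind of responsiveness measure the logarithmic model delivers.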

 

Figure – Decomposing sales to key drivers


In the figure, the sales of a popular ice cream brand have been decomposed to show how key drivers such as its own price, competitor spend, and marketing channel spend on press and TV impact the product's sales volume. Sales volume has been decomposed into its base effect (shown in blue), which can usually be interpreted as the sales derived from the brand value of the product, and the incremental effects derived from the sales drivers. The impact of the TV variable can be seen clearly in June and July, when the TV spend (shown in yellow) was made. A multiplicative model has been implemented here to capture the right effects of the variables and the seasonality of the product's sales, as the product has very seasonal demand.
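The base/incremental split described above can be sketched as follows for a multiplicative model: each driver's incremental contribution is the gap between fitted sales at the driver's actual level and fitted sales with the driver held at a reference level. The elasticity, spends and sales figures here are hypothetical, not taken from the case study:

```python
# Decomposing multiplicative-model sales into a base effect and a
# TV-driven incremental effect. All numbers are illustrative.
import numpy as np

e_tv = 0.3                       # assumed TV elasticity
base_level = 1.0                 # reference TV index (no incremental spend)
tv_index = np.array([1.0, 1.0, 2.5, 2.5, 1.0, 1.0])   # Jun/Jul TV burst
fitted_sales = np.array([100, 102, 135, 138, 101, 99], dtype=float)

# Base = what the model predicts with TV held at its reference level;
# in a multiplicative model, scaling a driver scales sales by (ratio)**elasticity.
base = fitted_sales * (base_level / tv_index) ** e_tv
incremental_tv = fitted_sales - base

for month, inc in zip(["Apr", "May", "Jun", "Jul", "Aug", "Sep"], incremental_tv):
    print(f"{month}: TV incremental ~ {inc:.1f}")
```

In months with no extra TV spend the incremental contribution is zero, while the June/July burst shows up as a clear lift over base, which is exactly the pattern the figure illustrates.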

 

A link to experience the true value of Market Mix Modelling is available here.

 

Right information at the right time through the right means

The adoption of modelling techniques in business, and the insights gained from them, are vital to the survival of any organisation. In this article I have shown how the concept of market mix modelling, developed in the right fashion, can add much value to your business and help you foresee things you could not have before. The right information at the right time, through the right means, is really worth your investment of time and money!

References:

[1] http://www.forbes.com/sites/jaysondemers/2014/02/10/2014-is-the-year-of-digital-marketing-analytics-what-it-means-for-your-company/

[2] http://analytics.sd-group.com.au/blog/additive-versus-multiplicative-marketing-mix-model/