Jun 15 2016

geniSIGHTS: Your Analytical centre of excellence

“Journey: from a concept to a company”

We are what we repeatedly do. Excellence, therefore, is not an act but a habit, and Aaum has a habit of always being excellent. Aaum, founded by IIT Madras alumni at the IIT Madras Research Park, has grown steadily since May 19, 2008 as Aaum Research and Analytics, and has now given rise to a subsidiary company called geniSIGHTS. The soft launch of geniSIGHTS was celebrated at the IIT Madras Research Park, Chennai on June 10, 2016.
The event started at 3 PM, warmly hosted by Ms Parvathy Sarath, Director and Data Evangelist of Aaum. With pride in Aaum's journey since 2008, Parvathy introduced the guests to Aaum and its story over the years: from a one-person to a 25+ member company, and from an IITM Research Park incubatee to a successful graduate. She recalled the creation of Aaum by two IIT Madras alumni, first-generation entrepreneurs with a passion for analytics and a zest to make a significant contribution to the analytics industry. Parvathy described the moment as a life-giving event for the dream project geniSIGHTS for many at Aaum. She traced the evolution of geniSIGHTS as a concept three years ago, born of the need to enable companies with business insights ranging from basic reporting and dashboarding capabilities to advanced analytics, and nurtured by the research wing of Aaum, which worked consistently to transform this concept into a bleeding-edge, technology-agnostic, affordable and customizable platform that helps businesses scale analytics as per their requirements.

Ms Parvathy took the opportunity to thank all the customers, investors, advisors and the team for their belief, support and contribution in bringing geniSIGHTS to life. She cordially welcomed Mr Mohan Narayanan, Founder and CEO of Kubos Consultancy Services; Mr Srinivasan (Vishy) Viswanathan, Co-Founder at Ultimate Business Advisors LLP; Mr Sriram Sampath, Vice President at Servion Global Solutions; Mr Vijay Babu, Managing Director of Start Smart Labs India; and Mr Sridharan J S, Co-Founder & Sr. Vice President of geniSIGHTS Pvt Ltd, to light the lamp of glory and inaugurate the ceremony for the success ahead, for there are many events in the womb of time yet to be delivered in geniSIGHTS' journey forward. The session was then handed over to Mr Rajesh Kumar, Founder and Managing Director of Aaum Analytics.



“The woods are lovely, dark and deep. But I have promises to keep, and miles to go before I sleep.” – Robert Frost

With this awakening quote, Mr Rajesh opened the session introducing geniSIGHTS by thanking his team for their dedication and hard work. He extended his gratitude to IIT Madras, Prof Ashok Jhunjhunwala, Mr Murugappan of the Murugappa Group, Usha Narayanan, Mohan, Vishy, customers and partners for their continuous support. He then familiarized the guests with geniSIGHTS and explained how organizations can use it as an affordable, customizable platform that helps businesses scale analytics as per their requirements and integrate firm-specific analytical solutions. With geniSIGHTS specializing in building end-to-end BI/analytics practices for customers, Rajesh noted that it can be deployed on customer premises or in a public or private cloud. With minimal engineering and maximum customization, even during product implementation, Mr Rajesh said that geniSIGHTS can be implemented smoothly and can even integrate with an already existing BI/analytics platform.

Mr Rajesh then outlined the 4Cs of the geniSIGHTS platform: ‘Connect’ to extract, transform and load the data; ‘Customize’ based on the needs; ‘Compute’ with the analytical engine; and ‘Consume’ the analytical insights, spanning seven domains such as retail, etail, travel, finance, telecom, big data and business analytics, with more than 56 solutions.

Following Rajesh's session, an activity named “geniConnections” was conducted to deepen the guests' understanding of the various features of geniSIGHTS through a contemporary game rather than a traditional walkthrough. Coordinated by Mr Ashwanth, the game used pictures that indirectly depicted various analytics terms, prompting the guests to guess and decode the relationships between the pictures creatively.


The guests and dignitaries were honoured with mementos, and everyone was then led to the pantry to celebrate the moment by cutting the cake. It was an emotional, cloud-nine moment for Rajesh, who cut the geniSIGHTS concept cake on behalf of the Aaum family. While the guests were enjoying the treat, the team went around and interacted with the dignitaries, and the hall was energised with happiness and joy.


To give more clarity on how geniSIGHTS resolves the problems faced by companies across various industries, an interactive activity was orchestrated by Mr Ashwanth, in which the guests were presented with problematic situations that companies in the respective industries come across, and in return were asked to offer their insights and suggestions to solve the issues.


Later, Ms Sri Nithya and Ms Pavithra explained the geniSIGHTS way of solving the same problems with the aid of the geniSIGHTS website. The session was concluded by Ms Parvathy, who briefly walked through the features and components of the geni website.

Mr Sridharan J S, Co-Founder & Sr. Vice President of geniSIGHTS Pvt Ltd, was privileged to express his gratitude through the vote of thanks. Having witnessed the entire journey of geniSIGHTS, from identifying the need for a platform that would render insights, through the challenges thrown at the team in meeting critical requirements set to international standards, to the discussions about finding a name and a brand ambassador, it was a gracious and heart-pounding moment for Sridharan. He took the opportunity to thank Mr Mohan, Mr Vishy, Mr Sriram of Servion and Mr Vijay Babu for lighting the lamp; Ms Parvathy Sarath for the welcome address and introduction of Aaum and geniSIGHTS; Mr Rajesh Kumar for presenting geniSIGHTS and providing useful detail on the product; and Mr Ashwanth for hosting a uniquely engaging event for the audience.


A circle of strength, founded on faith, joined in love and kept together as family – the Aaum family members swish wands to “Experience the Magic of Insights” for all.

For experiencing the magic of Insights please visit: http://www.genisights.com/

May 27 2016

Comprehensive Study of Hadoop Distributions


Anyone paying attention to big data is probably aware of how hot Hadoop is right now. Hadoop is a powerful open-source software framework that makes it possible to process large data sets across clusters of computers. This design makes it easy to scale quickly from a single server to thousands. With data sets distributed across commodity servers, companies can run it fairly economically, without the need for high-end hardware. A number of vendors have developed their own distributions, adding new functionality or improving the code base. Vendor distributions are designed to overcome issues with the open-source edition and provide additional value to customers, with a focus on areas such as the following (a short sketch of the Hadoop programming model that all of these distributions package appears after the list):

  • Reliability – Vendors react faster when bugs are detected, promptly delivering fixes and patches, which makes their solutions more stable.
  • Support – A variety of companies provide technical assistance, which makes it possible to adopt the platforms for mission-critical and enterprise-grade tasks.
  • Advanced system and data management tools – Additional tools and features covering security, management, workflow, provisioning and coordination.
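To make the comparison concrete, here is a minimal sketch of the programming model that every one of these distributions ultimately packages: a word-count job written for Hadoop Streaming, where the mapper and reducer are plain Python scripts that read standard input and write standard output. The file names, input/output paths and streaming-jar location are illustrative only and vary by distribution.

    #!/usr/bin/env python
    # mapper.py -- emit "<word><TAB>1" for every word read from stdin
    import sys

    for line in sys.stdin:
        for word in line.strip().split():
            print("%s\t%d" % (word.lower(), 1))

    #!/usr/bin/env python
    # reducer.py -- Hadoop sorts mapper output by key, so all counts for a word arrive together
    import sys

    current_word, current_count = None, 0
    for line in sys.stdin:
        word, count = line.rstrip("\n").split("\t", 1)
        if word == current_word:
            current_count += int(count)
        else:
            if current_word is not None:
                print("%s\t%d" % (current_word, current_count))
            current_word, current_count = word, int(count)
    if current_word is not None:
        print("%s\t%d" % (current_word, current_count))

The job would typically be submitted with something like: hadoop jar hadoop-streaming.jar -input /data/text -output /data/wordcount -mapper mapper.py -reducer reducer.py -file mapper.py -file reducer.py. Every vendor's distribution wraps this same kind of job with its own management, security and monitoring tooling.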

Several infrastructure vendors, such as Oracle, IBM, Cloudera, Hortonworks, EMC Greenplum and others, also provide their own distributions and promote them by bundling a Hadoop distribution with custom-developed systems referred to as ‘engineered systems’. Engineered systems with bundled Hadoop distributions form ‘engineered big data systems’.

The major players in the industry are Amazon (Elastic MapReduce), Cloudera, Hortonworks, MapR, IBM, Oracle, and EMC Greenplum.


Table: Comparison of the latest Hadoop distributions

  • Amazon Web Services leads the pack due to its proven, feature-rich Elastic MapReduce subscription service.
  • IBM and EMC Greenplum (now Pivotal) offer Hadoop solutions within strong enterprise data warehouse (EDW) portfolios.
  • MapR is the best choice if you are looking for a complete Hadoop stack with every feature.
  • Cloudera includes components such as a user interface, security and integrations, and makes administration of your enterprise data hub simple and straightforward; using Cloudera Manager you can operate your big data environment centrally.
  • Hortonworks is the only vendor whose distribution is built entirely from open-source Hadoop services.



References:

  1. Hadoop Distributions: Evaluating Cloudera, Hortonworks, and MapR in Micro-benchmarks and Real-world Applications, by Vladimir Starostenkov (Senior R&D Developer) and Kirill Grigorchuk (Head of R&D Department).
  2. EMC Federation Big Data Solutions, 2015.
  3. Using IBM InfoSphere BigInsights to accelerate big data time to value (IBM white paper).
  4. http://www.wipro.com/documents/Hadoop-vendor-distributions.pdf
  5. http://www.oracle.com/technetwork/database/bigdata-appliance/overview/bigdataappliance-datasheet-1883358.pdf
  6. https://www.ibm.com/support/knowledgecenter/SSPT3X_2.1.2/com.ibm.swg.im.infosphere.biginsights.admin.doc/doc/c0057891.html
  7. https://aws.amazon.com/elasticmapreduce/


May 27 2016

Spark streaming vs Flink

There are plenty of distributed stream-processing systems on the market, but among them Apache Spark is the most widely used, largely because of the fundamental need for faster data processing on real-time streaming data. With the rise of the new contender Apache Flink, one begins to wonder whether it is time to shift from Spark Streaming to Flink. Let us examine the pros and cons of both tools in this article.

From its inception, Apache Spark (Fig. 1.1) has provided a unified engine that supports both batch and stream processing workloads, whereas other systems either have an engine designed only for streaming or expose similar batch and streaming APIs that compile internally to different engines. Spark Streaming discretizes the streaming data into micro-batches: it receives data and buffers it in parallel on Spark's worker nodes. This enables both better load balancing and faster fault recovery. Each batch of data is a Resilient Distributed Dataset (RDD), the basic abstraction of a fault-tolerant dataset in Spark.


Fig: 1.1
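As a minimal sketch of the micro-batch model described above, the snippet below uses PySpark's classic DStream API; the socket source, host, port and the 5-second batch interval are illustrative choices, not a recommendation.

    from pyspark import SparkContext
    from pyspark.streaming import StreamingContext

    sc = SparkContext(appName="MicroBatchSketch")
    # Every 5 seconds the data received so far is cut into one RDD (a micro-batch).
    ssc = StreamingContext(sc, 5)

    lines = ssc.socketTextStream("localhost", 9999)    # hypothetical text source
    counts = (lines.flatMap(lambda line: line.split())
                   .map(lambda word: (word, 1))
                   .reduceByKey(lambda a, b: a + b))    # runs once per micro-batch
    counts.pprint()

    ssc.start()
    ssc.awaitTermination()

Because each micro-batch is just an RDD, the same transformations used in Spark batch jobs apply unchanged here, which is exactly the unified-engine point made above.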

Apache Flink (Fig. 1.2) is a newer big data processing tool known for processing data quickly, with low latency and high fault tolerance, on large-scale distributed systems. Its essence is its ability to process streaming data in real time, like Storm: it is primarily a stream-processing framework that can also behave like a batch processor. It is optimized for cyclic or iterative processes through optimized join algorithms, operator chaining and reuse of partitioning.


Fig: 1.2

Both systems are aimed at being a single platform on which you can run batch, streaming, interactive, graph-processing and machine-learning workloads. Flink provides event-level granularity, which Spark Streaming does not, since the latter is in essence fast batch processing. Because of this intrinsic batching, support for windowing is very limited in Spark Streaming, while Flink offers better windowing mechanisms: windows can be based on processing time, event time or number of records, and can be customized further. This flexibility makes Flink's streaming API very powerful compared to Spark Streaming's.
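As a sketch of such windowing, the snippet below uses the PyFlink DataStream API to keep all examples here in one language; the exact window classes vary across Flink releases, and the bounded sample data, the key and the 10-second processing-time tumbling window are purely illustrative.

    from pyflink.common import Time
    from pyflink.common.typeinfo import Types
    from pyflink.datastream import StreamExecutionEnvironment
    from pyflink.datastream.window import TumblingProcessingTimeWindows

    env = StreamExecutionEnvironment.get_execution_environment()

    # Illustrative bounded source; in practice this would be Kafka, a socket, etc.
    sales = env.from_collection(
        [("shop-1", 20.0), ("shop-1", 35.0), ("shop-2", 12.5)],
        type_info=Types.TUPLE([Types.STRING(), Types.FLOAT()]))

    (sales
        .key_by(lambda record: record[0])
        # Flink windows can be defined by processing time, event time or record count.
        .window(TumblingProcessingTimeWindows.of(Time.seconds(10)))
        .reduce(lambda a, b: (a[0], a[1] + b[1]))    # per-shop sum within each window
        .print())

    env.execute("windowed_sales_sketch")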

While Spark Streaming follows a procedural programming model, Flink follows a distributed dataflow approach. So, whenever intermediate results are required, broadcast variables are used to distribute the pre-calculated results to all the worker nodes.

Among the similarities between Spark Streaming and Flink are exactly-once guarantees (correct results even in failure cases), thereby eliminating duplicates, and very high throughput compared to other processing systems such as Storm. Both also provide automatic memory management.

For example, if you need to compute the cumulative sales of a shop over a specific time interval, batch processing handles it with ease; but when an alert must be raised the moment a value crosses its threshold, the situation is better handled by stream processing.
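A sketch of the streaming half of that example, again with Spark Streaming's DStream API: a per-shop sum over a sliding window, with an alert emitted whenever the windowed total crosses a hypothetical threshold (the input format, port, window sizes and threshold are all invented for illustration).

    from pyspark import SparkContext
    from pyspark.streaming import StreamingContext

    THRESHOLD = 10000.0    # hypothetical alert level

    sc = SparkContext(appName="SalesAlertSketch")
    ssc = StreamingContext(sc, 5)

    # Each input line is assumed to look like "<shop_id>,<sale_amount>".
    def parse(line):
        shop_id, amount = line.split(",")
        return (shop_id, float(amount))

    sales = ssc.socketTextStream("localhost", 9999).map(parse)

    # Sum per shop over the last 60 seconds, recomputed every 10 seconds.
    windowed = sales.reduceByKeyAndWindow(lambda a, b: a + b, None, 60, 10)

    alerts = windowed.filter(lambda shop_total: shop_total[1] >= THRESHOLD)
    alerts.pprint()    # in practice this would feed a dashboard or alerting service

    ssc.start()
    ssc.awaitTermination()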

Let us now take a deeper dive and analyze the features of Spark Streaming and Flink.


Though Spark has many advantages in batch data processing, it still has a number of streaming cases to catch up on. Flink can also handle batch processing, but it cannot yet be compared with Spark in the same league there. At this point Spark is a much more mature and complete framework than Flink, yet Flink appears to be taking stream processing in big data to the next level.


May 24 2016

Intelligent Traffic Management – Applying Analytics on Internet of Things

Extending the previous article “Revolutionizing the Agriculture Industry”, this article looks at how the Internet of Things (IoT) and analytics are going to revolutionize traffic management in the modern world. Day by day, roads are getting deluged with vehicles while the road infrastructure remains unchanged, and congestion in cities is cited as the major transportation problem around the globe. According to the Texas A&M Transportation Institute, in 2011 traffic congestion cost $121 billion in travel delays and wasted 2.9 billion gallons of fuel in the USA alone.

The world is becoming more intelligent: sensors in cars and on roads are connected to the internet, and the devices communicate data with each other. Using this intelligence, fleets can avoid accidents, predict car failures, take preventive maintenance actions, and more. Let us say hello to John.

John drives to the office every day. His intelligent car receives data about on-the-road events, such as accidents, that have taken place ahead on his route and guides him onto a more efficient alternative route. With this intelligence John reaches his destination, and his car even informs him about an available parking slot! This not only helps John save an enormous amount of time and fuel but also gives him a tension-free commute. Is any such service available already?


Meet Zenryoku Annai

Zenryoku Annai is a service provided in Japan by Nomura Research Institute (NRI). Using this service, subscribers all over Japan can plot the shortest travel routes, avoid traffic snarls and estimate what time they will arrive at their destinations. It combines information from satellite navigation systems linked to sensors at fixed locations along roads with traffic data derived through statistical analysis of position and speed information from subscribers, moving vehicles and even pedestrians; data from thousands of taxicabs is added to the mix. Using all this information, Zenryoku Annai analyzes road conditions and helps drivers plan routes more accurately and over a wider range than is possible with conventional GPS systems. As more and more vehicles were added over time, NRI adopted in-memory computing technology, which improved search speed by a factor of more than 1,800 over the previous relational database management system: about 360 million data points can now be processed in just one second.
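NRI's platform is proprietary, but the core idea of route planning over live speed data can be illustrated with a small, purely hypothetical sketch: a road graph whose edge weights are travel times computed from the latest observed speeds, over which a standard shortest-path query is run (using the networkx library; the segments, lengths and speeds below are invented).

    import networkx as nx

    # Hypothetical road segments: (from, to, length_km, latest_observed_speed_kmph).
    segments = [
        ("home", "ring_road", 3.0, 18.0),      # currently congested
        ("home", "back_street", 4.5, 40.0),
        ("ring_road", "office", 5.0, 22.0),
        ("back_street", "office", 6.0, 45.0),
    ]

    graph = nx.DiGraph()
    for src, dst, length_km, speed_kmph in segments:
        travel_minutes = length_km / speed_kmph * 60.0   # edge weight = current travel time
        graph.add_edge(src, dst, weight=travel_minutes)

    route = nx.shortest_path(graph, "home", "office", weight="weight")
    minutes = nx.shortest_path_length(graph, "home", "office", weight="weight")
    print(route, round(minutes, 1))   # picks the back street while the ring road crawls

Refreshing the edge weights as new probe data arrives and re-running the query is, in spirit, what a probe-based navigation service does continuously and at a vastly larger scale.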


Fig: 1.1

Fig. 1.1: In a conventional SatNav system, road conditions are known only at locations where sensors are installed; with NRI's probe technology, road conditions can be determined much more accurately using position and speed data delivered from in-car units, mobile phones and roadside sensors, and the system can then suggest the best alternative route for the user. Zenryoku Annai is not yet complete, since not every vehicle on the road is connected to it, but one can soon expect everything that runs on the road to be connected to the internet.


Driverless Cars: Another example!

The driverless/autonomous car is an ideal example in which IoT and real-time analytics play a crucial part. As the car travels down a road, it interacts with signals and other vehicles through various sensor points and routes itself along the shortest path, weighing congestion and distance so as to reach the destination in the minimum time. Pilot test runs of Google's cars (Fig. 1.2) showed that such a car generates about 1 gigabyte of data per minute from its surroundings, and these cars depend entirely on IoT and analytics to make decisions.


Fig: 1.2



IoT with analytics is still at a very nascent stage, and there are many barriers restricting its growth: data security, individual privacy, implementation problems and technology fragmentation. The average American commuter spent 14 hours a year stuck in traffic in 1982; by 2010 that had surged to 34 hours a year, and if the problem remains unsolved it may climb to 40 hours a year. With population density exploding and space running short, it is impossible to keep increasing road capacity; the practically viable option is to use the power of data analytics on IoT.


Views expressed on this article are based solely on publicly available information.  No representation or warranty, express or implied, is made as to the accuracy or completeness of any information contained herein.  Aaum expressly disclaims any and all liability based, in whole or in part, on such information, any errors therein or omissions therefrom.




May 12 2016

Big lessons from big data implementation – Part I

Each day, 23 billion GB of data are generated, and the rate at which big data is generated doubles roughly every 40 months! Apart from their own business data, organizations now also have humongous data available from Google, Facebook, Amazon and others. They wish they could use all of the available data to find useful information for running their business better. Let us look into the big data deployments of a few organizations and learn from their experience.

Case 1: Rabobank

Rabobank is a Dutch multinational banking and financial services company headquartered in Utrecht, Netherlands. It is a global leader in food and agro financing and sustainability-oriented banking. Rabobank started developing a big data strategy in July 2011 and created a list of 67 possible big data use cases. These use cases included:

  • To signal and predict the risks the bank is running and to prevent fraudulent actions;
  • To identify customer behaviour and to obtain a 360-degree customer profile;
  • To recognize the most influential customers as well as their networks;
  • To be able to analyse mortgages;
  • To identify the channel of choice for each customer.

For each of these categories they roughly estimated the time needed for implementation as well as the value proposition. In the end, Rabobank moved forward with the big data applications that improve business processes, as these offered the best prospect of a positive ROI. A dedicated, highly skilled, multidisciplinary team was created to start on the big data use cases, using Hadoop to analyze the data. Social data, open data and trend data were selected and integrated, so their data pipeline had to cope with a deluge of semi-structured and unstructured data. Hadoop, however, is only part of a big data strategy: the keys to success were the multidisciplinary team and the fact that they embraced uncertainty and accepted that mistakes would be made along the way.

Problems faced during implementation

Rabobank did not store raw data, owing to cost and capacity issues. Data quality was not consistent and the security concerns were significant. Rabobank also noticed that it was often unclear who owned the data and where all of it was stored. Hadoop is different from older database and data-warehousing systems, and those differences confused users.


Lessons learned:

  1. Specialized knowledge, as well as visualization, is very important in driving big data success.
  2. Start with the basics and don't stop at stage one; a big data implementation is a continuous journey to reap data-driven insights.
  3. Not having the right skills for the job can be a big problem.
  4. Do not underestimate the complexity of a big data system implementation; focus on data management.


Case 2: The Patient Protection and Affordable Care Act

In 2010 the newly elected president of the United States introduced the Patient Protection and Affordable Care Act. The main purpose of the act was to extend public and private insurance coverage across the population while controlling and reducing healthcare costs, and it requires people to interact with the government via a website to do so. The system is in essence a big data implementation, with data being collected on a potential population in excess of 300 million people across the entire country. Unfortunately, the project did not progress as planned and became mired in technological controversy.

Problems faced during implementation

  • The dispute over the act brought the country close to defaulting on its debt
  • Cost of Obamacare – $1.6 trillion
  • Estimated cost for 2014–2024 – $3.8 trillion

How the problems could have been anticipated:

  • Specialized knowledge, together with good visualization, could have helped prevent the losses.


Lessons learned:

  1. Do not underestimate the complexity of a big data system implementation; focus on data management.
  2. Prior analysis of, and planning for, the complexity of the data can prevent problems later.
  3. Most of the data collected and stored in an agency's transaction-processing systems lacks adequate integrity, so make sure that captured data meets integrity standards.
  4. Specialized knowledge, as well as visualization, is very important in driving big data success.
  5. Not having the right skills for the job can be a big problem.


We shall analyze a few more cases tomorrow. Keep watching this space.


Views expressed on this article are based solely on publicly available information.  No representation or warranty, express or implied, is made as to the accuracy or completeness of any information contained herein.  Aaum expressly disclaims any and all liability based, in whole or in part, on such information, any errors therein or omissions therefrom.


References:

  1. The Process of Big Data Solution Adoption, by Bas Verheij.
  2. Case study on Rabobank by Hewlett Packard Enterprise.
  3. Big Data for All: Privacy and User Control in the Age of Analytics, by Omer Tene and Jules Polonetsky.
  4. Realizing the Promise of Big Data: Implementing Big Data Projects, by Kevin C. Desouza.
  5. Case study on the Obamacare big data problem (Patient Protection and Affordable Care Act).
  6. http://www.mckinsey.com/business-functions/business-technology/our-insights/big-data-whats-your-plan
  7. http://www.businessofgovernment.org/blog/business-government/implementing-big-data-projects-lessons-learned-and-recommendations
  8. https://hbr.org/2012/11/the-analytics-lesson-from-the
  9. http://dataconomy.com/hadoop-open-source-software-pros-cons/


May 11 2016

Revolutionizing the Agriculture industry – Applying Analytics on Internet of Things

Analytics on Internet of Things

The Internet of Things (IoT) and big data analytics may be the two most buzzed-about terms in industry over the past two years. IDC has forecast that IoT will yield $8.9 trillion in revenue by 2020, and Goldman Sachs has estimated that 28 billion devices will be connected to the internet by 2020. Each of these connected devices will send back humongous amounts of data every second, so a proper analytical tool or solution is needed for value creation.

Introduction to IoT:

IoT is a network of interconnected objects able to collect and exchange data. This ecosystem enables entities to connect to, and control, their devices: a device carries out a command and/or sends information back over the network to be analyzed and displayed remotely. For example, at 6 am John's phone receives an email informing him that his meeting has been pushed back; his mail service then tells his smart clock to give him an extra 30 minutes of sleep and alerts him to the change once he wakes. This is how the basic system works; adding analytics to this setup is what enables firms to make efficient, well-informed decisions with the available data.

Analytics on IoT in Agricultural Industry:

How will this trend influence the agriculture industry? The answer: it is going to entirely revolutionize the industry's working patterns rather than merely influence them. By 2050 the global population is expected to reach 9 billion people (34% higher than today), so food production must increase by at least 70%; on the other hand, the U.S. Department of Agriculture states that 90% of all crop losses are due to weather-related incidents. The challenge, then, is to minimize weather-related losses and to use the limited fresh water wisely while drought prevails across the globe (in fact, 70% of the world's fresh water is already used for agriculture). These issues can be addressed with predictive analytics, which will play a major part in value creation. Predictive analytics is the central element in forecasting the future picture, and it requires a lot of input data across many distinct variables; one basic idea is to identify and differentiate between high- and low-yielding croplands by measuring their productivity.

Flint River Valley project:

Hyper-local forecasting techniques can help farmers overcome the obstacles above. The Flint River Valley is a core part of Georgia's agricultural industry, contributing roughly $2 billion annually in farm-based revenue. A pilot has been run in the Flint River Valley in the USA by researchers from the Flint River Soil and Water Conservation District, the U.S. Department of Agriculture, the University of Georgia and IBM. The primary objective is to give farmers useful weather information by analyzing the data obtained from the various sensors installed across the fields (Fig. 1.1). The sensors collect data such as temperature and moisture levels in the air and soil, which is blended with satellite data. This data feeds variable-rate irrigation technology, which enables farmers to conserve water through sprinklers that turn off over areas that do not need water and turn back on over areas that do.

Fig: 1.1
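The variable-rate irrigation logic described above can be sketched in a deliberately simplified, hypothetical form as a per-zone decision that combines the latest soil-moisture reading with the hyper-local rain forecast; the thresholds and field data below are invented for illustration.

    # Hypothetical per-zone decision for variable-rate irrigation.
    MOISTURE_TARGET = 0.30     # volumetric soil moisture below which a zone counts as "dry"
    RAIN_PROB_CUTOFF = 0.60    # skip watering if rain is likely within the forecast window

    def sprinkler_on(soil_moisture, rain_probability):
        """Return True if this zone should be watered in the next cycle."""
        if soil_moisture >= MOISTURE_TARGET:
            return False                              # zone is already wet enough
        return rain_probability < RAIN_PROB_CUTOFF    # dry, and rain is unlikely

    # Latest sensor readings and hyper-local forecast per field zone (invented values).
    zones = {
        "zone_a": {"soil_moisture": 0.22, "rain_probability": 0.10},
        "zone_b": {"soil_moisture": 0.35, "rain_probability": 0.10},
        "zone_c": {"soil_moisture": 0.18, "rain_probability": 0.80},
    }

    for name, reading in zones.items():
        state = "ON" if sprinkler_on(**reading) else "OFF"
        print(name, state)    # zone_a ON, zone_b OFF (wet enough), zone_c OFF (rain expected)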


Fig: 1.2


Fig. 1.2 shows cloud water density, i.e. the water content in a cloud, which is important in figuring out which type of cloud will form and helps determine the cloud formations likely to occur; this is extremely useful for weather forecasting.

According to IBM, farmers will be able to track weather conditions in 10-minute increments up to 72 hours in advance. A full 72-hour forecast generates around 320 gigabytes of data, while each individual farmer requires only a small, personalized slice of it. A weather model with 1.5-kilometre resolution is also being built for the farmers. The approach is estimated to save 15% of the total water used in irrigation, amounting to millions of gallons per year, at a cost of around $20–$40 per acre for the first three years.

 Fig: 1.3


With geospatial mapping, sensors and predictive analytics, farmers are presented with granular, real-time data as time series and graphs: soil quality, field workability, details on nitrogen, pests and disease, precipitation, temperatures and harvest projections, even predictions of expected revenue relative to the commodity's market trend, all analyzed and reported via a smartphone (Fig. 1.3), tablet or desktop. In future it may even become mandatory to use IoT and analytics in the agriculture industry to sustain and grow, maximizing revenue many times over with minimal use of resources.


The IoT is on its way to becoming the next technological revolution, with $6 trillion to be invested before 2020 and a predicted ROI of $13 trillion by 2025 (cumulative over 2020–2025). Given the massive amounts of revenue and data that IoT will generate, its impact will be felt across the entire big data universe, forcing companies to upgrade their current tools and processes, and technology to evolve to accommodate the additional data volume and take advantage of the insights.




May 10 2016

Dawn Of Online Aggregators – How Business Analytics enabled them

Online aggregators are websites in the e-commerce industry that collect information about various goods and services from several competing sources and present it in one place. The aggregator model helps consumers with customized, tailored offerings that cater to their needs and wants, and it adds value to their feedback and to the underlying services by revamping the shopping experience.

Who are the Online Aggregators?

Aggregators have been warmly welcomed by both end players, customers and businesses alike, since they increase sales and extend the reach of products and services to customers, benefiting both sides. They flourish across many industries such as travel, payment gateways, insurance and taxi services; some firms open up secondary markets over the internet, like letgo, Locanto and Vinted, while others run food-ordering services such as Campusfood, GimmeGrub, DiningIn, GetQuick et al.

Business Analytics – As a key factor for the aggregator’s triumph

To bring about such a dynamic yet sweeping change in the e-commerce industry relative to what was traditionally followed, what pitch would an aggregator firm have taken up? What served as the base for these firms to venture into this space? It is made possible by unlocking marketing data and turning it inside out, a job well handled by business analytics, which can be seen as the intersection of data science with business.

Business analytics is the study of data through statistical and operational analysis. It focuses on drawing new insights from the data collected and using them to enhance business performance. It is closely related to management science, since it relies extensively on fact-based explanatory and predictive models, built with statistical analysis, to drive management decision-making.

Uber – a study on how business analytics has augmented their business

In an online aggregator like Uber, which operates across the world, business analytics plays a crucial role. Uber is an app-based technology platform that links passengers who want to hire a taxi with drivers who are ready to accept a ride; Uber takes 20% of the cab fare as its commission and the remainder goes to the driver. The firm operates across 444 cities worldwide. A city like New York is served by around 14,000 Uber cabs while the unorganized sector holds about 13,500 taxis, and in Los Angeles, of 22,300 cabs, about 20,000 are registered with Uber.


Fig 1: Market share of Uber


Uber maintains huge data sets on all its drivers, its users and every city in which it operates, so that it can instantly match a passenger with a nearby driver. In the USA, traditional taxi meters charge passengers based on the duration of the ride, whereas Uber follows a patented algorithm that uses both the distance and the duration of the trip. On top of this, Uber applies a technique called surge pricing: the fare is multiplied during surge periods, when demand overflows, for which the firm had to study traffic in New York City (Fig 2). Passengers are warned in advance that the rate is a multiple of the normal one. This mechanism advises drivers to stay home when rides are scarce, or encourages them to get behind the wheel to earn extra money while the city is congested.


Fig 2: Analysis of traffic hours in New York City, and the surge-price notice with which the Uber app informs the passenger of the multiplied charge.
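Uber's actual surge algorithm is patented and proprietary, but the basic mechanic described here, multiplying the normal fare when demand outstrips driver supply, can be sketched with a hypothetical multiplier function; the rate card, cap and thresholds below are invented.

    # Hypothetical surge-pricing sketch: fare = (base + per-km + per-minute) x multiplier.
    BASE_FARE, PER_KM, PER_MIN = 2.0, 1.2, 0.3    # invented rate card
    MAX_SURGE = 3.0

    def surge_multiplier(open_requests, available_drivers):
        """Scale the price with the demand/supply ratio, capped and never below 1.0."""
        if available_drivers == 0:
            return MAX_SURGE
        ratio = open_requests / available_drivers
        return max(1.0, min(MAX_SURGE, round(ratio, 1)))

    def trip_fare(distance_km, duration_min, open_requests, available_drivers):
        normal = BASE_FARE + PER_KM * distance_km + PER_MIN * duration_min
        return normal * surge_multiplier(open_requests, available_drivers)

    # The same 8 km / 25 minute trip on a quiet afternoon vs. during rush hour.
    print(trip_fare(8, 25, open_requests=40, available_drivers=50))   # ~19.1 (no surge)
    print(trip_fare(8, 25, open_requests=90, available_drivers=50))   # ~34.4 (1.8x surge)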


In one study, Uber found that many New York passengers travel from roughly the same locality to roughly the same destination, and when surveyed a majority of them agreed to share their cab even with a stranger. UberPool is a service in which a passenger can be matched, along the way to being picked up, with another waiting passenger headed to roughly the same destination. This lets passengers share the cab with a stranger and cuts down the cost.


Fig 3: Operation of UberPool

Uber lets the passenger rate the driver at the end of every trip based on the driver's knowledge of the city roads, professionalism, driving ability, car quality and punctuality. The ratings are used to evaluate drivers, to coach them on the necessary skills, or even, in the extreme, to remove them from the service.


Fig 4: Distribution of drivers by rating, and the self-assessment chart shown to drivers in the Uber app.
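The rating-based evaluation described above can be sketched in a few lines of pandas: average each driver's trip ratings and flag those who fall below a hypothetical cutoff (the trips and the 4.6 cutoff are invented; Uber's real policy is not public).

    import pandas as pd

    RATING_CUTOFF = 4.6    # hypothetical minimum acceptable average rating

    # Invented per-trip ratings left by passengers.
    trips = pd.DataFrame({
        "driver_id": ["d1", "d1", "d2", "d2", "d2", "d3"],
        "rating":    [5,    4,    5,    5,    4,    3],
    })

    summary = (trips.groupby("driver_id")["rating"]
                    .agg(avg_rating="mean", trip_count="count")
                    .reset_index())
    summary["needs_coaching"] = summary["avg_rating"] < RATING_CUTOFF
    print(summary)    # in this toy data, d1 and d3 fall below the cutoff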



In the case of Uber's journey as an online aggregator, business analytics has come in handy to benchmark the company against other market players; to understand customer needs and driver attitudes; to devise new strategies like UberPool, which lets passengers share the taxi and its fare; to build a rating system for managerial decisions in favour of or against drivers; and to arrive at a contemporary pricing method, surge pricing, which charges passengers based on changes in demand.

A firm ought to draw on business analytics when making managerial decisions, planning a strategic move or launching a new product or service, because analytics replaces assumptions with firm statistical evidence across various business situations. It helps management make decisions faster and improves critical performance with precise data in hand. Business analytics helps a business acquire and retain customers and reduce churn. The analysis yields deeper insights about the market, helps find the target customers, evaluates the impact on them of changes in price or service, and clarifies their expectations. Business analytics, in short, is the brain of an organisation, enabling proactive decisions and planning the business for maximum success by looking into the future.


(The data and information used in this article are drawn from the referenced sites and documents and are not self-generated.)


Sep 19 2015

Building Data Science in your organization – Is it really important?

Does data science bring value to your organization? Hang on, what is this data science really? Why should we really care about it? You ought to care because it might change the way you run your business in the near future, whether you like it or not, just as software and IT changed the world a few decades ago! IT is omnipresent, and those who didn't care to change their wheels suddenly had to pay a steep price to change course. Today, thankfully, the cost of IT adoption is very low, since the industry has matured enough to provide the right solutions at a low price. But the initial period was crucial: people were extremely cautious about implementation, evaluated ROI, and questioned why they should invest, considering the new systems, servers, recruitment and so on it required. Those were times when a clear ROI from IT could not be calculated. IT was nascent, and there were gross blunders like the Y2K issue. But the world changed, and IT has touched almost every facet of life.

Whether we like it or not, data science is going to bring another transformation to day-to-day activities. There is a wide perception that data science applies only to bigger organizations and not to small and medium businesses. Of course, there are successful early adopters, and a few have burnt their fingers with analytics adoption. Moneyball showcases how Billy Beane used data science effectively to build a competitive baseball team, and of course there are criticisms of the same strategy in other games. Success requires careful adaptation to the business context.

Analytics adoption is maturing. It is time for companies to realize quick, data-driven insights if they want a competitive edge over their peers. How should one adopt an analytics strategy? Come and attend our analytics event “Analytics for CXOs”, a must-not-miss program for CXOs who want to adopt and implement analytics in their organization. Highlights include:

  1. Update yourself on the bleeding-edge analytics developments happening in the industry
  2. Get a customized analytics roadmap specific to your organization/department
  3. Consult with industry experts – subject to prior appointment
  4. Explore practical use cases with analytics implementations, benefits and value
  5. Join focused group discussions on the benefits, challenges and issues faced by organizations
  6. Learn best practices from industry, academia and peers
  7. Receive the Analytics Jumpstart Kit – small data or big data, a definite must-have for your organization

Registration @ Explara -  https://in.explara.com/e/analytics-for-cxos
Date: Saturday, November 21, 2015, 9:00 AM to 5:00 PM
Venue: IIT Madras Research Park, Chennai

Mar 14 2015



The 2015 Data Science Conclave took place on February 20-21 at Hotel Rain Tree, Chennai, and featured keynotes, panel discussions, breakout sessions, “lightning” talks and more. A special thanks to our sponsors Target India and Contact Singapore for making this event a grand success!

Agenda of the Data Science Conclave…


The event opened with a featured keynote address by Mr. Rajesh Kumar, Founder & Managing Director, AAUM Research & Analytics Pvt Ltd. Rajesh emphasized the need for data science practices in organizations and how insights from data science are transforming the data-driven world.

Taking it forward, Ms. Parvathy Sarath, Director & Data Evangelist at AAUM, with extensive experience in finance, retail, social media, human resources and government, opened a featured presentation on Data Science: Learn, Develop and Deploy along with Lead Data Scientist Mrs. Praveena Sri. They explained the importance of R, understanding data through R, data visualization and predictive analysis, and how beneficial these can be for an organization in the longer run.

In the second session of Data Science: Learn, Develop and Deploy Analytics in Your Organization, Mr. Rajesh Kumar and Ms. Parvathy Sarath dealt with topics such as logistic analysis, multivariate analysis and decision trees using R. Mr. Bala Chandran, a Hadoop developer with AAUM, then presented on big data analytics, the explosion of big data, and the tools and techniques involved in handling it.

Mr. Elayaraja and Mr. Sankar Sundaram of Mobius Knowledge Services spoke next on integrating analytics with big data. Their agenda covered web evolution, web pattern matching, NoSQL and probabilistic models.

Mr. Bala Chandran of AAUM Research and Analytics then opened a featured presentation on Cloud for Data Science, covering topics such as computing platforms, cloud services and cloud deployment tools.
These sessions brought together selected experts from the corporate world to share their knowledge, leading to a better understanding of the specific challenges and opportunities for data science across sectors of society and the economy.

The last session, Do It Yourself, built on the conversations and work done in the previous sessions of the day, helped participants test their understanding of data science, and ensured inclusion and broad participation so that everyone benefited from the conference.


The second day's events moved away from broadcast formats that treat everybody the same and evolved towards discussions that allowed individual participants to learn what they needed to learn, as well as to connect with peers and peer organizations of real value to them.

The day started with Mr. Naveen Gainedi, Senior Group Manager of Analytics and Reporting at Target India, presenting the data science practices of his organization. The presentation facilitated a focused discussion on the elements of a data science practice and on building data science teams.

To look at the other side, how a startup builds data science practices in its organization, Mr. Velmurugan, head of the Big Data Analytics Practice at Altimetrik, spoke about the resources, challenges, processes and frameworks involved in establishing such practices.

Panel Discussion 1, on technology spend, was headed by Ms. Bharathi Muthu, General Manager of IBM Software Market Management for South Asia. She did an exceptional job of drawing the delegates into the panel discussion and kept them engaged throughout. The other panelists were delegates from leading companies: Mr. Nitin Chaudary, head of the Products and Technology function at Samunnati; Mr. S. M. Bala Subramaniyan, mentor and strategic advisor; and Mr. Satya, Manager at HP Analytics.


Panel Discussion 2, on predictive analysis, was headed by Mr. Naveen Gainedi of Target India. He moderated the discussion with three other panelists, Mr. VRK Rao from CTS, Mr. Karthik Karunakaran from Mobius and Mr. Velmurugan from Altimetrik, and showed how to engage the participants and organize a captivating panel.


Panel Discussion 3, on machine learning, was headed by Ms. Madhumitha from Wikimedia. She and the panelists, Prof. Ronojoy Adhikari from the Institute of Mathematical Sciences and Mr. Dorai Thodla, chief mentor at Build Skills, made the participants active contributors to the discussion.


The AAUM, Target and Mobius teams presented real-world case studies to create a classroom environment for group analysis and discussion, while simulations immersed participants in experiential situations.

To summarize the entire two-day event, an insightful interactive session was conducted by Mr. Rajesh Kumar and Mr. Bala Subramanian to reinforce learning, interaction and engagement among the participants.

Finally, it was time to close the conference! Mr. Sridharan, Vice President of AAUM Research and Analytics, expressed deep appreciation to all the speakers and participants.

It was indeed a great event, with like minds gathering in one place around the common goal of learning, developing and deploying data science in the organization.


Fittingly, the event focused on evolving data science as a trend-setter for future analytics and on changing the entire way of looking at business excellence into a rather more insightful one.

With more and more events to come, this definitely paves the way for a new era of analytics for business: an insightful business!


Aug 8 2014

Walmart B2B online – An opportunity for Indian Retail?

Image: Customers shop at a Best Price Modern Wholesale store, a joint venture of Wal-Mart Stores Inc and Bharti Enterprises, at Zirakpur.

Is organized retail killing the kirana shops in India? Things were pretty different a few years ago, when many people declared the end of days for kirana shops. But we have not seen any such major change, rather a few very interesting developments. Organized retail outlets did kick off in a great way but did not kill the mom-and-pop shops; in fact, everybody grew in the booming Indian economy. Long story short: mom-and-pop shops not only survived, they scaled their operations and expanded to more locations! Enter Walmart. What people said about Walmart's business in India a few years ago was very different from what its B2B operations turned out to be. Walmart's B2B business exclusively targets small and medium businesses: through its “Best Price Modern Wholesale” stores, Walmart B2B serves kirana store owners and provides benefits such as:

  1. There are an estimated 12 million kirana stores in India, of which as many as 90% are not directly serviced by India's FMCG majors. Best Price stores offer them access to the quality products they need, at the lowest prices, when they need them.
  2. The assortment, service and store layout of Best Price stores are customized to kirana stores' specific needs, helping them obtain high-quality products at the best prices to enhance their business profitability.
  3. Best Price stores help kirana stores manage their inventory better by enabling them to purchase in the quantities they need and at the time they require. They can hence use a Best Price store as their own godown, freeing up their capital for business rather than locking it up in inventory.
  4. Best Price stores have also created an innovative kirana model, “My Kirana”, tailored to provide kirana stores with training and insights in areas such as assortment planning, hygiene, in-store displays, inventory management and value-added services.
  5. In addition, education programs for members, with customized modules on taxation, food preparation, food safety and category workshops, have been introduced for different target segments.

Walmart has thus adopted an approach that benefits small and medium businesses while benefiting from their growth. This gives kirana shops a very good opportunity to procure quality items at the time they need them, and Walmart is now extending this to an e-retail format.

This also creates a huge opportunity for these local shops to scale to the next level if they can adopt innovative strategies. Yes, data-driven insights will come to their rescue. They need to protect their loyal customers against the competition in order to sustain and grow in this market. Those who embrace analytics quickly will stand out and emerge winners in the neighborhood; these Davids can certainly take on the Goliaths. Data-driven analytics strategies will help!

Aaum’s geniSIGHTS solution has been helping retail customers to quickly embrace analytics and power their day-to-day business operations with  intelligent decisions. Know more about Aaum’s retail/eTail operations at http://genisights.com/retail/ and http://genisights.com/ecommerce

Reach out to the author by mailing your queries/suggestions to info@aaumanalytics.com