Questions & Answers

  
  • Answer the Question
    Tom Perry
    October 25th, 2016 11:45 PM

    The same was the case with me. There is no issue; I have been using the power of big data alongside our Salesforce system. Read the use cases below; they might clear the clouds hovering in your mind.

    With its range of popular services such as Sales, Marketing, Chatter, Desk, and Force.com, Salesforce is a premiere cloud computing solution. It handles around a billion transactions per day through its API, mobile, and web channels. Gathering event data is essential for both internal and product use cases. Salesforce gathers events in a central location: they are collected through logging and application instrumentation, and for each logged event the relevant context is captured. This gives us the ability to mitigate performance problems and to track application errors, usage, and adoption across the Salesforce code base. We use a combination of log-mining tools: Hadoop for analytic use cases and Splunk for operational use cases.
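    To make "for each logged event, we collect the relevant information" concrete, here is a minimal sketch of structured event logging in Python. The field names and values are illustrative assumptions, not Salesforce's actual logging schema:

    ```python
    import json
    import time
    import uuid

    def log_event(event_type, user_id, **context):
        """Build one structured log event as a JSON line.
        Field names here are illustrative, not Salesforce's real schema."""
        event = {
            "event_id": str(uuid.uuid4()),   # unique id for deduplication
            "timestamp": time.time(),        # when the event occurred
            "event_type": event_type,        # e.g. "api_request"
            "user_id": user_id,              # who triggered it
            **context,                       # any event-specific fields
        }
        return json.dumps(event)

    # Hypothetical usage: record one API request with its latency.
    line = log_event("api_request", "user-42", endpoint="/query", latency_ms=118)
    ```

    Emitting one self-describing JSON line per event is what lets downstream tools like Hadoop and Splunk consume the same stream for very different purposes.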

    Further, about Hadoop: it is an open-source MapReduce technology used to answer queries over large data sets. I have personally used it for product metrics, capacity planning, Chatter file recommendations, and user recommendations. Product Metrics is used by product managers to understand the usage and adoption of features: it lets them define features, add log instrumentation, agree on a standard set of metrics, and then measure, visualize, and make decisions based on trends. In Salesforce, Custom Objects are used to define the features and their log instrumentation. Custom Java programs auto-generate Pig scripts, which mine the logs extensively on an ad-hoc basis. These scripts run on Hadoop clusters, aggregating the data and storing daily summaries in other Custom Objects. The data is then visualized through Reports and Dashboards in a self-service way.
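    The daily-summary step described above can be sketched in plain Python. The real pipeline uses auto-generated Pig scripts on a Hadoop cluster; the tab-separated log format here is an assumption made for illustration:

    ```python
    from collections import Counter

    def daily_feature_summary(log_lines):
        """Aggregate raw feature-usage log lines into per-(date, feature)
        counts -- the kind of daily summary a Pig script would emit.
        Each line is assumed to look like: 'date<TAB>user_id<TAB>feature'."""
        counts = Counter()
        for line in log_lines:
            date, user_id, feature = line.rstrip("\n").split("\t")
            counts[(date, feature)] += 1
        return counts

    # Hypothetical log lines standing in for a day's worth of events.
    logs = [
        "2016-10-25\talice\tchatter_post",
        "2016-10-25\tbob\tchatter_post",
        "2016-10-25\talice\treport_run",
    ]
    summary = daily_feature_summary(logs)
    # summary[("2016-10-25", "chatter_post")] == 2
    ```

    At Salesforce scale this grouping runs as a distributed MapReduce job, but the aggregation logic per key is the same.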

    Collaborative filtering is the algorithm behind many recommendation systems at Amazon, Twitter, Facebook, and other popular sites. We adapted Amazon's item-to-item collaborative filtering algorithm and chose a community-based approach, which avoids the "cold start" problem for a new user in the enterprise. We ran the algorithm on the Hadoop cluster. In this fast-paced industry, big data technologies have become necessary for success.
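    The core of item-to-item collaborative filtering is an item co-occurrence count. Here is a minimal in-memory sketch; in practice the pair counting would be a Hadoop job over usage logs, and the data and function names are my own assumptions:

    ```python
    from collections import defaultdict
    from itertools import combinations

    def item_cooccurrence(user_items):
        """Count how often two items are used by the same user --
        the core statistic in item-to-item collaborative filtering."""
        pairs = defaultdict(int)
        for items in user_items.values():
            for a, b in combinations(sorted(set(items)), 2):
                pairs[(a, b)] += 1
                pairs[(b, a)] += 1
        return pairs

    def recommend(item, pairs, top_n=3):
        """Return the items most often co-used with `item`, best first."""
        scored = [(other, n) for (a, other), n in pairs.items() if a == item]
        return [other for other, n in sorted(scored, key=lambda s: -s[1])][:top_n]

    # Hypothetical usage data: which files each user worked with.
    users = {
        "u1": ["fileA", "fileB", "fileC"],
        "u2": ["fileA", "fileB"],
        "u3": ["fileB", "fileC"],
    }
    pairs = item_cooccurrence(users)
    # recommend("fileA", pairs) -> ["fileB", "fileC"]
    ```

    Because the scores come from community-wide co-occurrence rather than one user's history, a brand-new user can get sensible recommendations immediately, which is how the cold-start problem is avoided.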

    Oman Khan
    October 25th, 2016 11:43 PM

    Adoption of a new technology always brings some new challenges along with it. Big data tools are made to integrate seamlessly with your business model without losing their efficiency, processing the data in ways that reveal opportunities for future decisions. But big data still makes you face significant challenges:

    • Data silo and complexity challenge – a predictive analytical application sifts data from internal and external sources, such as relational databases, semi-structured XML, dimensional MDX, and Hadoop.
    • Query performance challenge – big data is all about large volumes of data. This data must be analyzed quickly, making query performance a critical success factor.
    • Agility challenge – most of the time, new and ever-changing analyses are required. This means new data sources must be brought on board as early as possible, and existing sources must be modified to support each new analytical requirement.

    Mind these challenges if you are adopting big data.

    Stephen Fleming
    October 25th, 2016 11:42 PM

    I think it depends on your existing system. If you are used to working with huge clusters of data, then it might be easy for you to manage the space to store such large volumes. What you really need is the right tool in the right place to analyze and process the data. To derive results, you will need to hire data scientists. There is no doubt that big data is highly capable of producing graphical, visualized analytical results for you by mining data that is unstructured and hard to manage with your traditional database systems.

    Analysis of big data, including types of data that have never been analyzed before, offers a deep level of insight into customer trends and business operations. The potential payoffs include better customer retention, more product sales, higher-quality production, and lower rates of return. As the market grows, small enterprises and large businesses alike are adopting big data technology without blinking. In my view, every business is subject to risk, so don't worry; just invest your resources in the right business intelligence tool. I assure you that big data can really improve the bottom line and impact overall profitability. Everything else stays the same; the basic difference is that big data is a new data source that has not been collected, captured, structured, analyzed, and scrutinized before. And the core advantage you will notice is that these large volumes of data need not be structured, which is exactly what makes them impossible to manage with a traditional database system and creates the challenge.

    If you are using a traditional database management system, business intelligence tools will be quite new to you, but you will have to adapt to the change at some point. I recommend that you analyze your business requirements and try to scale up your business, if possible. And if you do need a big shift in technology, take suggestions from experienced IT professionals and a big data service provider. Hire data scientists and integrate your existing system with the big data tool of your choice. I hope you understand my point.