
Tuesday, October 9, 2012

Using Lean Agile Methodologies for Planning & Implementing a Big Data Project @ "Data Informed Live!" on Dec. 10


I am scheduled to speak at the "Data Informed Live!" event being held December 10-11, 2012, at the San Jose Marriott in San Jose, California. The event is focused on planning and implementing big data projects. This two-day event targets business and IT managers, with the goal of giving them the knowledge they need to develop and execute a "big data" plan for their companies.

Click here to register.
 
The first day is dedicated to planning aspects, while day 2 focuses on implementation success factors. I am speaking on day 1, and my talk is about using lean agile methodologies for defining product requirements. Everyone knows that requirements change for data and software projects as things start taking shape, especially when the project involves new concepts and technologies such as big data, yet most traditional project management approaches treat requirement changes as exceptions. I will be talking about how the agile requirements-gathering and product design approach embraces change; and because change is anticipated in agile project development frameworks, it allows projects to stay on track.

I believe that traditional requirements-gathering processes do not work for big data projects because end users cannot yet fully grasp the capabilities and power of big data, and hence cannot describe what they need.

An iterative agile approach, where requirements gathering, design, and implementation are done in small (two- to four-week) iterations, allows end users to visualize what can be done and what is needed, and helps the development team understand how long it will take. It also keeps the project moving ahead while providing the flexibility to accommodate changes as end users discover new requirements and developers work out technical nuances. My session will explain how the agile approach works, provide advice for using it, and give real-world examples of how others have used it successfully.

Whether you are planning your "Big Data" project or implementing it, "Data Informed Live!" will prepare you for success by covering the following critical issues:

- Process: The key processes that both impact and are impacted by the proposed big data project
- Organization: How to design and re-engineer your organization to implement and utilize big data
- Tools, Platforms, and Technology: The platforms and tools that can assist you in designing and implementing the big data project

Click here to register or to find out more. 









Friday, May 11, 2012

Oracle Unfolds Details of Its Arsenal of Speed-of-Thought Big Data Analytics Tools

May 3, San Francisco – Today, Oracle laid out details of its “speed of thought” big data analytics suite at its Big Data and Extreme Analytics Summit.

The Oracle Exalytics in-memory BI machine forms the core of the analytics suite. Oracle Exalytics is a co-packaged combination of Sun hardware and in-memory analytics software tools that have been co-designed for optimal combined performance. The set of analytics tools includes Oracle's in-memory BI Foundation, an architecturally unified business intelligence solution that caters to reporting, ad hoc query, and analysis needs. It also includes the Endeca Information Discovery tool, which Oracle acquired in December of last year. This combination of co-optimized hardware and software is touted as "speed of thought" analytics, allowing advanced, intuitive exploration and analysis of data, whether structured or unstructured.
 
Oracle Exalytics can work against data from both the Oracle Big Data Appliance (a distribution of Hadoop with a matching co-designed Sun server) and the Oracle RDBMS.
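As a rough illustration of what "working against both" can look like in practice, here is a minimal Python sketch. This is not Oracle's Exalytics tooling itself, and the host names, credentials, and table names are hypothetical placeholders; it simply shows rows from a Hive table on a Hadoop cluster and rows from an Oracle database being pulled into a single analysis step.

```python
import pandas as pd
import cx_Oracle            # Oracle RDBMS driver
from pyhive import hive     # SQL access to Hive tables on a Hadoop cluster

# Hypothetical connection details -- substitute your own.
hadoop_conn = hive.connect(host="hadoop-gateway.example.com", port=10000)
oracle_conn = cx_Oracle.connect("scott/tiger@dw.example.com:1521/orcl")

# Pull clickstream events from Hadoop and order facts from Oracle.
clicks = pd.read_sql("SELECT customer_id, page_views FROM clickstream", hadoop_conn)
orders = pd.read_sql("SELECT customer_id, order_total FROM orders", oracle_conn)

# Normalize column names (Oracle returns them uppercase; Hive may
# prefix them with the table name), then combine into one view.
orders.columns = [c.lower() for c in orders.columns]
clicks.columns = [c.split(".")[-1] for c in clicks.columns]
combined = clicks.merge(orders, on="customer_id")
print(combined.head())
```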

Endeca, a suite of applications for unstructured data management, Web commerce, and business intelligence, has been positioned as the primary information discovery tool in Oracle's BI suite.

The combination of Oracle Exalytics hardware and matching BI software gives businesses access to both structured and unstructured information from different sources through a single interface. This gives them a deep, wide, and timely view of their customers and allows in-context exploration of their business data. Beyond answering whether a business has increased its revenues or missed its sales targets, businesses today not only want but need to know why.
 
In the past, to gain such information, businesses have had to go to separate data sources, such as promotions data, time frames, locations, and channels, and still ended up with only an incomplete view of what they were looking for, as other influencing factors, such as current events and climatic changes, inevitably got missed.

With the Oracle Exalytics In-Memory machine and Oracle BI tools, businesses will be able to gather information, both structured and unstructured, from a far wider spectrum of sources, such as survey companies, CMS systems, customer reviews, tweets, and news reports, and consolidate it all into one single view. To facilitate big data analysis, the software then lets you arrange data by demographic categories such as gender, age, or location.
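To make the "single consolidated view" idea concrete, here is a small, self-contained sketch in Python with pandas. This is not Oracle's actual software; the data, column names, and keyword-based sentiment scoring are invented for illustration. It joins structured sales records with unstructured review text and then summarizes by a demographic category:

```python
import pandas as pd

# Hypothetical structured source: sales records from an RDBMS export.
sales = pd.DataFrame({
    "customer_id": [1, 2, 3, 4],
    "age_group":   ["18-34", "35-54", "18-34", "55+"],
    "revenue":     [120.0, 340.0, 80.0, 210.0],
})

# Hypothetical unstructured source: free-text customer reviews.
reviews = pd.DataFrame({
    "customer_id": [1, 2, 3, 4],
    "review": [
        "Great product, fast shipping",
        "Disappointed with the battery life",
        "Love it, would buy again",
        "Average at best",
    ],
})

# Crude sentiment signal derived from the raw text (a stand-in for
# real text analytics): +1 for a positive keyword, -1 for a negative one.
def score(text: str) -> int:
    text = text.lower()
    if any(w in text for w in ("great", "love")):
        return 1
    if any(w in text for w in ("disappointed", "average")):
        return -1
    return 0

reviews["sentiment"] = reviews["review"].map(score)

# Consolidate both sources into one view, then slice by demographics.
combined = sales.merge(reviews[["customer_id", "sentiment"]], on="customer_id")
summary = combined.groupby("age_group").agg(
    total_revenue=("revenue", "sum"),
    avg_sentiment=("sentiment", "mean"),
)
print(summary)
```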

The key features of the co-engineered Oracle Exalytics In-Memory BI machine and software were described as:

  • Advanced data visualization and exploration that quickly provide actionable insight from large amounts of data. Oracle claims the software can be used by somebody with no previous training, thanks to its user-friendly interface.
  • The ability to load data as is, without the need for costly cleansing, so iteration and evolution can be accelerated.
  • Faster performance than competing solutions for business intelligence, modeling, forecasting, and planning applications.
  • Comprehensiveness: the hybrid search/analytical database was designed to collect and compile all the information a business needs to make informed critical decisions, regardless of source, format, or type.