Recap of Tweetchat: "In-Memory for Big Data Management" - Part 2
svisser1
This is part 2 of the "In-Memory for Big Data Management" Tweetchat recap. See also: Recap of Tweetchat: "In-Memory for Big Data Management" - Part 1
BigDataAlex A4: HPC, next gen chip design, less I/O disk functions in our code, converging toward better scientific computing.
jameskobielus A4: data scientist can ingest, regress, visualize, explore, model, score, iterate & deploy stat models more rapidly
troycoleman I currently see in-memory at the moment for BI. On z/OS I see the Netezza connection to DB2 as the way of achieving this.
jeffreyfkelly A4 less trips to the watercooler waiting for query response
BTRG_MikeMartin Data Science is the practice of deriving insights from data to solve business problems
InfoMgmtExec A4 - Decision Scientists spend way too much time today conditioning & gathering data. In Memory can have it all in one place.
Natasha_D_G A4: Data scientist gain major advantage when they can access & digest massive amts data in secs.
dfloyer Using flash in conjunction with DRAM increases the scope of problems tackled and improves recoverability dramatically
furrier one issue is counterfeit Flash NAND devices, where data recovery is not possible; there is a 'healthy' industry of counterfeiting Flash NAND
jameskobielus A4: data scientists can refine models far more rapidly if they hold all or most of relevant working data in fast RAM
BigDataAlex A4: Fire Scientists in Montana are using in-memory computing to better understand wild land fire given a changing climate.
jeffreyfkelly A2 in-memory allows Data Scientists to ask more questions, to quickly refine questions, and to more quickly find answers
johncrupi Shouldn't we start talking about "scale-out in-memory". Single box, "scale-up in-memory" isn't going to help the #BigData cause
Natasha_D_G A4: When data scientists can find answers 2 questions they didn’t THINK to ask it’s a win
zacharyjeans I don't know the difference between 'scale out' & 'scale up' in memory.
TerraEchos In memory allows the analyst and #datascientist to reduce the workflow with great analytical depth - win/win
jameskobielus A4: ask more questions more rapidly against more of data, in-mem, helps data scientists drill deeply to patterns
jameskobielus A4: Yes but right now we're talking apps in data scientist productivity, not specific scaling approaches 4 in-mem.
Ercan__Yilmaz A4. To the effect that it improves data munging and visualization, it helps
furrier IBM CEO says Big Data a priority- she is smart
BTRG_MikeMartin In- Memory can be expensive and limited in volume
InfoMgmtExec A4 - In-Memory allows DS to create a "Memory Palace" for Models, A/B Tests, Algorithms in development, etc. All in real-time.
BigDataAlex A5: Flash memory is cheap, and getting cheaper.
jameskobielus A5: Yes, in-mem more expensive acq than HDD, but coming down rapidly. Cost per IOPS, though, in-mem cost-effective
zacharyjeans A5: In-Memory must either serve a mission critical system, or profit the company via efficiency gain.
BigDataAlex A5: It would take only one shelf of a flash-based storage system.
BigDataAlex A5: It takes 4 racks of disk storage to create a system capable of 1 million IOPS, or input/output operations per second.
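To put BigDataAlex's figures in perspective, here is a rough back-of-envelope sketch in Python. The per-device IOPS numbers (~200 for a spinning disk, ~50,000 for an enterprise flash drive) are illustrative assumptions, not figures from the chat:

```python
# Back-of-envelope estimate of device counts needed to reach 1 million IOPS.
# Per-device figures below are rough assumptions for illustration only.
TARGET_IOPS = 1_000_000
HDD_IOPS = 200        # assumed random IOPS per 15K RPM spindle
FLASH_IOPS = 50_000   # assumed random IOPS per enterprise SSD

hdds_needed = -(-TARGET_IOPS // HDD_IOPS)    # ceiling division
ssds_needed = -(-TARGET_IOPS // FLASH_IOPS)

print(f"HDDs needed: {hdds_needed}")   # thousands of spindles: multiple racks
print(f"SSDs needed: {ssds_needed}")   # a couple dozen SSDs: roughly one shelf
```

At those assumed per-device rates, hitting 1M IOPS takes thousands of spindles but only a couple dozen SSDs, which is the "four racks vs. one shelf" contrast in the tweets above.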
InfoMgmtExec Reductions in latency well worth the cost factors.
johncrupi In-memory is the means to the end. #Qlikview marketed in-memory for years before they realized DIY analysis was the real goal.
jeffreyfkelly A5 hybrid approach - in-memory/disk - often needed to make economics work
katsnelson A5 right cost model for the right type of data. Nothing is cheap or expensive on its own. Too expensive for something
zacharyjeans A5: We wouldn't even be talking In Memory solutions today if the price for RAM wasn't becoming so reasonable.
dvellante A5. Isn't it really a balance? - hierarchy of media from in-m
BigDataAlex Energy consumption would drop by 80 percent since memory-based systems consume less energy and require fewer air conditioners.
johncrupi I really want scale-out with scale-up in-memory. I don't want 1000 nodes each managing 32G of memory. I want 10-20 managing 2T.
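The trade-off johncrupi describes can be made concrete with quick arithmetic; the node counts and memory sizes come from the tweet, and only the GB/TB conversion is added:

```python
# Total pooled memory of the two cluster shapes from the tweet:
# "scale-out": 1000 nodes x 32 GB each
# "scale-up":  20 nodes x 2 TB (2048 GB) each
scale_out_total_gb = 1000 * 32   # 32,000 GB spread across 1000 small heaps
scale_up_total_gb = 20 * 2048    # 40,960 GB in just 20 large heaps

print(scale_out_total_gb, scale_up_total_gb)
```

The totals are similar, but the scale-up shape implies far fewer network hops and far less cross-node data shuffling per query, which is the point of the tweet.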
InfoMgmtExec A5 Economics self-evident. Living in real-time world using tools that are not real-time. Reducing Latency to Zero is end game.
dvellante A5. Best economic solution is intelligence in file sys where active data svcd fm fast memory and slow data is in the bit bucket
TerraEchos an opportunity cost? @BTRG_MikeMartin: Like EVERYTHING it is about finding the cost/benefit.
furrier A5: open source impacts the economics when talking mission critical; sw written to live in-memory is paradigm shift #disruption
dvellante A5. imho less a matter of $ + more case of biz impact. If biz case=excellent $ of in-mem is irrelevant
BigDataAlex A6: IMC can help folks leverage their data warehouse - rewire the house for speed.
jeffreyfkelly A6 back to economics - don't need your entire DW in-memory - use in-memory to supplement trad DW workloads
dvellante A6. DW/BI for years has been like a "snake swallowing a basketball" -in memory is critical to solve this problem
InfoMgmtExec A6 In Memory EDW is Holy Grail. Makes EDW more of "real-time repository" that can better serve Operational & Analytical needs.
dvellante A6. Ask any DW practitioner and they'll tell you a story of "chasing the chips"
zacharyjeans A6: I don't know the answer. What are the stability issues with long term storage on physical media vs In Memory solutions?
jameskobielus A6: In-mem is optimal for front-end in 3-tier DW architecture. RDBMS is hub tier. Hadoop/NoSQL is staging tier.
CuneytG A6 in memory analytics is needed if you expect fast reply from dw supported by hadoop
jameskobielus A6: In-memory is for fast front-end data access, query, mart, exploration. DW hub can leverage HDD for storage.
johncrupi I think the future for in-memory is #GPGPU using GPU memory and processing
BTRG_MikeMartin But only if having the DW in-memory makes business sense.
dfloyer Data warehouse provides "cubes" of formal data - easy to ensure data provenance - however just one source of "cubes"
cristianmolaro A6 in-memory processing has been for ages THE performance strategy of every database management system... bufferpools?
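cristianmolaro's point is worth unpacking: a bufferpool is essentially an in-memory page cache sitting in front of disk, so frequently touched pages are served from RAM. A minimal sketch of the idea (a toy LRU cache, not any particular DBMS's implementation):

```python
from collections import OrderedDict

class BufferPool:
    """Toy LRU buffer pool: a page is fetched from 'disk' once,
    then served from memory until it is evicted."""

    def __init__(self, capacity, fetch_from_disk):
        self.capacity = capacity
        self.fetch = fetch_from_disk
        self.pages = OrderedDict()  # page_id -> page data, in LRU order
        self.hits = 0
        self.misses = 0

    def get(self, page_id):
        if page_id in self.pages:
            self.pages.move_to_end(page_id)  # mark as most recently used
            self.hits += 1
            return self.pages[page_id]
        self.misses += 1
        page = self.fetch(page_id)           # slow path: read from disk
        self.pages[page_id] = page
        if len(self.pages) > self.capacity:
            self.pages.popitem(last=False)   # evict least recently used
        return page

# Tiny demo: a 2-page pool; repeated access to page 1 hits the cache.
pool = BufferPool(capacity=2, fetch_from_disk=lambda pid: f"page-{pid}")
for pid in [1, 2, 1, 3, 1]:
    pool.get(pid)
print(pool.hits, pool.misses)  # 2 hits (page 1 reused), 3 misses
```

Production bufferpools add dirty-page tracking, write-back, and smarter replacement policies, but the hot-data-in-RAM principle is the same one the in-memory discussion builds on.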
tomjkunkel How does DataBase Virtualization compare?
BigDataAlex A6: People need to save money in building and supporting their warehouse. IMC is one way to get there.
johncrupi In a year, will we still be talking about in-memory as a separate thing. Or will it just become in-memory analytics.
cristianmolaro A6 some data warehousing appliances take advantage of in-memory processing of data. An example is IBM IDAA
BigDataAlex A6: #Forbes is writing about In Memory Computing - paradigm shifting.
johncrupi All I want is virtualized memory where I can run all my real-time analytics. Is that so hard ;>
jameskobielus A6: In-mem in front-end means less roundtripping 2 DW in back-end. Save CPU & bandwidth
jeffreyfkelly A6 must balance biz value of better performance via in-memory versus cost as applied to DW workloads - all workloads really
cristianmolaro A6 often computer systems are CPU rich and Memory poor... in some cases adding more memory can be the best performance upgrade
jameskobielus A6: You can do both. Potentially, In-mem can be virtualized across server cluster. Memory pooling
jameskobielus A6: in a year we'll be discussing in-mem as key data persistence & execution option in a real-time analytic infra
cristianmolaro A6 a huge amount of memory is not necessarily a recipe for great performance: the system has to divide info to conquer