April 23rd, 2007

NonBlocking HashTable Source Code


I am pleased to announce, after a loooong delay, that the source code to my NonBlocking Hash Table is available as open-source on SourceForge:
http://sourceforge.net/projects/high-scale-lib
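For anyone who wants to kick the tires, here’s a minimal usage sketch (it assumes the org.cliffc.high_scale_lib package and the NonBlockingHashMap class from the library). Since the class implements java.util.concurrent.ConcurrentMap, it drops in wherever a ConcurrentHashMap is used today:

import java.util.concurrent.ConcurrentMap;
import org.cliffc.high_scale_lib.NonBlockingHashMap;  // assumed package/class name from the library

public class NBHMExample {
    public static void main(String[] args) {
        // Behaves like any ConcurrentMap, but reads and updates are lock-free.
        ConcurrentMap<String, Integer> hits = new NonBlockingHashMap<String, Integer>();
        hits.put("home", 1);
        hits.putIfAbsent("about", 1);          // standard ConcurrentMap operations
        hits.replace("home", 1, 2);            // atomic compare-and-replace
        System.out.println(hits.get("home"));  // prints 2
    }
}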
I’ll be adding to this library over time when I’m not so busy!  Right now I’m porting Java6 to Azul, reviewing OOPSLA papers (19 papers, about 15 pages each, mostly thick academic stuff), and making JavaOne slides.  In fact, I’ll be talking about the NonBlocking Hash Table at JavaOne (slides) this year, along with a couple of other talks.
Here’s an interesting (to me) discussion about tiered compilation in HotSpot.  Interesting to me because at Azul we’ve basically forked from the main HotSpot source base on this issue.  Our version of tiered compilation is alive and well, with a tidy performance win pretty much across the board.  Of course, we don’t inline true virtual calls (i.e., megamorphic calls, or calls that actually target more than one method at runtime – as opposed to those that can be statically predicted), because our hardware lets us do such calls fairly cheaply.  Inlining the megamorphic call “nails down” the decision to do the Java-level call via the most expensive mechanism (albeit with compiler-aided scheduling, which will help Sparc & X86 but not Azul), and nails it down at “server” compile time.
Since Azul’s tiered compilation is not nailing down the decision to do such calls “the hard way”, if it turns out the call site is really monomorphic we get to do the call via an inline-cache, i.e., really cheap.
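To make the jargon concrete, here’s a tiny, self-contained Java example (the class names are mine, not from HotSpot or Azul’s code). The single virtual call site s.area() is monomorphic when only one receiver class ever reaches it, and megamorphic once more than one target method actually runs there:

interface Shape { double area(); }
class Circle implements Shape { public double area() { return Math.PI; } }
class Square implements Shape { public double area() { return 1.0; } }

public class CallSiteDemo {
    static double total(Shape[] shapes) {
        double sum = 0;
        for (Shape s : shapes)
            sum += s.area();   // one virtual call site; what reaches it decides how cheap it is
        return sum;
    }
    public static void main(String[] args) {
        // Only Circles reach the call site: it stays monomorphic, so an
        // inline cache (or inlining) makes it nearly as cheap as a static call.
        total(new Shape[] { new Circle(), new Circle(), new Circle() });
        // Circles and Squares both reach it: the site is megamorphic and
        // needs a full virtual dispatch.
        total(new Shape[] { new Circle(), new Square(), new Circle() });
    }
}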
Cliff
PS: I struck out on Wikipedia today, failing to find entries for: megamorphic calls, inline caches, tiered JIT compilation (and several variations on that theme), as well as entries for IBM’s J9 JVM (which I know has tiered compilation).  How many readers of this blog know what an inline-cache is?  (hint: it’s a way to make >95% of virtual Java calls go nearly as fast as plain static calls).
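For those readers, here’s a toy sketch of the inline-cache idea in plain Java (a conceptual illustration only, not how HotSpot actually implements it): remember the receiver class seen last time at a call site; if the next receiver has the same class, skip the method lookup and call the cached target directly.

import java.lang.invoke.MethodHandle;
import java.lang.invoke.MethodHandles;
import java.lang.invoke.MethodType;

interface Animal { String sound(); }
class Dog implements Animal { public String sound() { return "woof"; } }
class Cat implements Animal { public String sound() { return "meow"; } }

public class InlineCacheSketch {
    // The "inline cache": the receiver class seen last time, plus the resolved target.
    private static Class<?> cachedClass;
    private static MethodHandle cachedTarget;

    static String callSound(Animal a) throws Throwable {
        Class<?> c = a.getClass();
        if (c != cachedClass) {                       // cache miss: do the expensive lookup once
            cachedTarget = MethodHandles.lookup()
                .findVirtual(c, "sound", MethodType.methodType(String.class));
            cachedClass = c;
        }
        return (String) cachedTarget.invoke(a);       // cache hit: call straight through the cached target
    }

    public static void main(String[] args) throws Throwable {
        System.out.println(callSound(new Dog()));     // miss: primes the cache
        System.out.println(callSound(new Dog()));     // hit: same class, no lookup
        System.out.println(callSound(new Cat()));     // a second class shows up: miss again
    }
}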
