Kit Menke is a Data Engineer / Software Architect from St. Louis, MO, USA. He works mostly with open-source Big Data technologies like Apache Hadoop, Apache Storm, Apache HBase, Apache Hive, and Apache Spark.
He is a member of the St. Louis Hadoop User Group and has spoken at multiple conferences, including DataWorks Summit and StampedeCon.
When working in a corporate environment, you’ll often have to deal with self-signed certificates used to secure internal dev tools such as Artifactory or a Git server.
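One common workaround is to import the self-signed certificate into the JVM’s truststore so Java-based tools stop rejecting it. A minimal sketch, assuming a hypothetical host `artifactory.internal` and the default truststore password:

```shell
# Fetch the server's certificate (host/port are placeholders)
openssl s_client -connect artifactory.internal:443 -showcerts </dev/null \
  | openssl x509 -outform PEM > artifactory.pem

# Import it into the JVM's default truststore
# (the stock password for cacerts is "changeit")
keytool -importcert -noprompt \
  -alias artifactory-internal \
  -keystore "$JAVA_HOME/jre/lib/security/cacerts" \
  -storepass changeit \
  -file artifactory.pem
```

On newer JDKs the truststore lives at `$JAVA_HOME/lib/security/cacerts` instead; adjust the path for your installation.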
I needed to run Spark’s shell with a different home directory due to a permissions issue with my normal HDFS home directory. A little-known feature of Spark is that it lets you override Hadoop configuration properties!
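Spark forwards any property prefixed with `spark.hadoop.` into the Hadoop `Configuration` it builds, so Hadoop settings can be overridden per session. A hedged sketch (the override path below is illustrative, not the value from the original post):

```shell
# dfs.user.home.dir.prefix controls where HDFS looks for user home
# directories (normally /user); override it for this shell session only
spark-shell --conf spark.hadoop.dfs.user.home.dir.prefix=/tmp/home
```

The same `--conf spark.hadoop.*` mechanism works with `spark-submit` as well, which makes it handy for per-job tweaks that you can’t (or don’t want to) bake into the cluster’s `core-site.xml` or `hdfs-site.xml`.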
Running Apache Storm in production requires increasing the default nofile and nproc ulimits.
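On most Linux distributions, those limits can be raised for the user running the Storm daemons via `/etc/security/limits.conf`. A sketch, assuming the daemons run as a `storm` user (the values are illustrative, not a recommendation from the post):

```
# /etc/security/limits.conf — raise open-file and process limits for storm
storm    soft    nofile    128000
storm    hard    nofile    128000
storm    soft    nproc     65536
storm    hard    nproc     65536
```

The new limits take effect on the next login session for that user, so the Storm daemons need to be restarted afterwards.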
I recently ran into an issue submitting Spark applications to an HDInsight cluster. The job would run fine until it attempted to use files in blob storage, and then it would blow up with an exception:
java.lang.ClassCastException: org.apache.xerces.parsers.XIncludeAwareParserConfiguration cannot be cast to org.apache.xerces.xni.parser.XMLParserConfiguration.
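This kind of `ClassCastException` typically means two different copies of Xerces ended up on the classpath (for example, one shaded into the application jar and one shipped with the cluster), so classes from one copy are being cast to interfaces loaded from the other. One hedged workaround is to pin the JAXP parser factory to the JDK’s built-in Xerces via JVM options; this is a sketch, not the fix from the original post, and the class and jar names are placeholders:

```shell
# Pin the DocumentBuilderFactory to the JDK's bundled Xerces so the copy
# inside the application jar doesn't clash with the cluster's copy
spark-submit \
  --conf "spark.driver.extraJavaOptions=-Djavax.xml.parsers.DocumentBuilderFactory=com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderFactoryImpl" \
  --conf "spark.executor.extraJavaOptions=-Djavax.xml.parsers.DocumentBuilderFactory=com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderFactoryImpl" \
  --class com.example.MyApp \
  myapp.jar
```

A longer-term fix is usually to exclude Xerces from the application’s fat jar entirely and rely on the version the cluster provides.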