HP Procurve and Cisco 3750 Configuration Notes

In my original configuration, I set up a trunk of 3 VLANs between my Cisco 3750 and HP ProCurve switches. VLANs 1, 7 and 9 have been working fine. I want to extend it to include VLAN 13, so I made the following changes to allow VLAN 13 to go through the trunk.

Cisco (added 13):

switchport trunk allowed vlan 1,7,9,13

HP:

vlan 13
   tagged 24

=============================================

interface GigabitEthernet1/0/24
 switchport trunk encapsulation dot1q
 switchpor
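To confirm VLAN 13 is actually being carried after the change, the trunk can be checked from both sides. A sketch using the interface and VLAN numbers from the config above (adjust for your own ports):

On the Cisco 3750:

show interfaces gigabitEthernet 1/0/24 trunk
show vlan id 13

On the HP ProCurve:

show vlans 13
show vlans ports 24 detail

VLAN 13 should now appear in the "allowed and active" list on the Cisco side and show port 24 as Tagged on the HP side.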

Adding Permission to your Java home directory

<applet codebase="../jar/" ....

Your .java.policy file in your home directory would have an entry looking something like this:

grant codeBase "file:///Users/ben/path-to-project/build/Debug/project-name.jar" {
    permission java.security.AllPermission;
};

keytool -importcert -keystore "...\java\jdk1.6.0_17\jre\lib\security\cacerts" -storepass changeit -alias agrozoocert2011dsa -file agrozoocert2011dsa.cer
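To confirm the certificate actually landed in the keystore, the same store can be listed by alias. A sketch reusing the (elided) keystore path and alias from the import command above:

keytool -list -keystore "...\java\jdk1.6.0_17\jre\lib\security\cacerts" -storepass changeit -alias agrozoocert2011dsa

If the import succeeded, this prints the alias, entry type and certificate fingerprint; otherwise keytool reports that the alias does not exist.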

Hadoop Cluster - Bookmarks and help URLs

1.5. I have a new node I want to add to a running Hadoop cluster; how do I start services on just one node?

This also applies to the case where a machine has crashed and rebooted, etc., and needs to rejoin the cluster. You do not need to shut down and/or restart the entire cluster in this case.

First, add the new node's DNS name to the conf/slaves file on the master node.
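With the slaves file updated, the daemons can be started on the new node itself rather than restarting the whole cluster. A sketch assuming a classic Hadoop 1.x layout with HADOOP_HOME set (script names and daemons differ in later Hadoop/YARN versions):

# run these on the new node itself
$HADOOP_HOME/bin/hadoop-daemon.sh start datanode
$HADOOP_HOME/bin/hadoop-daemon.sh start tasktracker

Once started, the node registers with the NameNode and JobTracker and begins accepting blocks and tasks; it should appear on the cluster's web UI within a minute or so.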

Fine tuning Apache Hadoop Security Settings

Apache Hadoop is equipped with a robust and scalable security infrastructure. These notes are intended to help cluster administrators fine-tune the security settings of their clusters.

Quality of Protection:

The security infrastructure for Hadoop RPC uses the Java SASL APIs. Quality of Protection (QOP) settings can be used to enable encryption for Hadoop RPC protocols.

Java SASL provides the following QOP settings:

auth - authentication only
auth-int - authentication plus integrity protection
auth-conf - authentication plus integrity protection and privacy (encryption)
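In Hadoop, these SASL QOP levels are selected through the hadoop.rpc.protection property in core-site.xml. A sketch enabling the strongest level (the property values authentication, integrity and privacy correspond to the SASL auth, auth-int and auth-conf levels):

<property>
  <name>hadoop.rpc.protection</name>
  <value>privacy</value>
</property>

Note that auth-conf encrypts every RPC payload, so expect some CPU overhead compared with authentication-only.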

Best Practices Selecting Hadoop Hardware

Excerpts of this article are copyright their respective owners. The original article can be found at Hortonworks.

Apache Hadoop worker node hardware at Yahoo!: a lot of nodes with 6 x 2TB SATA drives, 24GB RAM and 8 cores in a dual-socket configuration. This has proven to be a pretty good configuration. This year, I've seen systems with 12 x 2TB SATA drives, 48GB RAM and 8 cores in dual-socket configurations. We will see a move to 3TB drives this year.
