Category Archives: Linux

Home Backup Server Using Bacula

I have several personal computers that store documents, pictures, music, and videos that I need to back up regularly. Google Drive, Google Picasa, and Google Play Music provide offsite storage for my most critical files, and while these cloud-based file copies are convenient, I do not completely trust the files to Google. What if the services are compromised or the files are corrupted?

The general recommendation is to have at least three copies of your important files and data: a primary online copy, a local backup, and an offsite backup.

I debated the need for a local backup and researched several secondary cloud-based options, including Amazon S3, Backblaze, and Crashplan. The catch with any cloud backup solution is bandwidth. A full backup or restore of a terabyte can take weeks, and it’s often faster to load the files onto an external hard drive and ship them to/from the cloud storage vendor.
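To put a number on that: at an assumed 2 Mbit/s residential upload speed (purely an assumption for illustration), a single terabyte works out to well over a month of continuous transfer:

# 1 TB is roughly 8,000,000 megabits; divide by an assumed 2 Mbit/s upload
echo $(( 8000000 / 2 / 86400 ))   # ~46 days of continuous upload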

After ruling out a cloud-based backup solution due to prohibitively long backup and restore times, I looked into a few different local backup solutions. Most of my experience has revolved around commercial products, including NetBackup, CommVault, and EMC NetWorker. Given that the commercial solutions are overpriced and overly complicated for home backup needs, I looked into Amanda and Bacula as open source alternatives. Either one works fine, and Amanda has a few more enterprise-class options, but I decided to go with Bacula because it is included in the Ubuntu repositories.

After selecting Bacula as the backup software, I researched a few different small servers. The backup server needed to support at least 3 disks and RAID5, and I ended up purchasing an HP ProLiant N40L MicroServer. The base server comes with 2GB of memory, a single 250GB hard drive, and a 4-port SATA RAID controller. I had two 1TB hard drives lying around, so I purchased two more for a total of 4TB of raw storage. The MicroServer has space for a 5.25″ optical drive, so I opted not to install an optical drive and instead purchased a 5.25″ to 3.5″ bay adapter to hold the included 250GB hard drive. Beyond the storage, I upgraded the memory to 2x4GB sticks and bought a remote access card for headless administration.

If you are interested in buying the same components, you can check them out on Amazon. 2TB hard drives are probably a better choice if you need the capacity. I just wanted to make use of the two 1TB drives I already had.

After the hardware arrived, I installed Ubuntu 12.04 LTS Server (64-bit) on the 250GB drive mounted in the optical bay. The OS drive is not redundant, but I figure the OS can be rebuilt on a new drive in a pinch. Then I installed the ZFS on Linux kernel module and configured the 4x1TB drives as a ZFS file system. Finally, I installed the Bacula server and configured all of the clients.
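For reference, this is roughly the sequence of commands involved. The pool name, device names, and raidz layout below are placeholders and assumptions (raidz based on the RAID5 requirement above), and the package names are the ones I recall for Ubuntu 12.04, so double-check them against your release:

# ZFS on Linux from the project PPA
sudo add-apt-repository ppa:zfs-native/stable
sudo apt-get update
sudo apt-get install ubuntu-zfs

# Pool the four 1TB data drives (device names are examples)
sudo zpool create backup raidz /dev/sdb /dev/sdc /dev/sdd /dev/sde
sudo zfs create backup/bacula

# Bacula director, storage daemon, and file daemon
sudo apt-get install bacula-server bacula-client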

Overall, the MicroServer fits nicely in my home media center, and the Bacula software has been doing a good job of consistently backing up my files for the past couple of months.

[Photo gallery, 2012-07-01: HP ProLiant N40L MicroServer + Bacula Backup Server (3 images)]

Ubuntu Linux + Sony BDP-S570 + DLNA

I bought a Sony BDP-S570 3D Blu-ray Disc Player a couple of months ago, and I also happen to run a MythTV server where I store all of my music, videos, and recordings. The BDP-S570 says it is a DLNA client, but for whatever reason it does not recognize the MythTV UPnP/DLNA server.

This is what I did to share my MythTV media with the Sony BDP-S570:

  • Add the unofficial MiniDLNA Ubuntu PPA:
    sudo add-apt-repository ppa:stedy6/stedy-minidna
  • Update the APT package index:
    sudo apt-get update
  • Install MiniDLNA:
    sudo apt-get install minidlna
  • Edit the /etc/minidlna.conf file:
    sudo vi /etc/minidlna.conf
    port=8200
    network_interface=eth0
    media_dir=A,/var/lib/mythtv/music
    media_dir=P,/var/lib/mythtv/pictures
    media_dir=V,/var/lib/mythtv/videos
    media_dir=V,/var/lib/mythtv/recordings
    friendly_name=MythTV DLNA Server
    album_art_names=Cover.jpg/cover.jpg/AlbumArtSmall.jpg/albumartsmall.jpg/AlbumArt.jpg/albumart.jpg/Album.jpg/album.jpg/Folder.jpg/folder.jpg/Thumb.jpg/thumb.jpg
    inotify=yes
    enable_tivo=no
    strict_dlna=no
    notify_interval=900
    serial=12345678
    model_number=1
  • Restart the MiniDLNA server, removing the existing media list:
    sudo /etc/init.d/minidlna stop
    sudo rm -r /tmp/minidlna
    sudo /etc/init.d/minidlna start
  • Turn on your Sony BDP-S570, and see if your media server is listed:
    Setup menu > Network Settings > Connection Server Settings
  • Scan for the media server if it is not already listed. It will have a status of “Shown” if it has been found.
  • Now try to play some of the media!
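If the player still does not list the server, the MiniDLNA log and listening port are the first things I check. The log path below is the Ubuntu package default as far as I remember, so treat it as an assumption:

    sudo tail /var/log/minidlna.log
    # A clean startup logs each media_dir as it is scanned; permission errors
    # usually mean the minidlna user cannot read the MythTV directories.
    sudo netstat -lnp | grep 8200   # confirm it is listening on the configured port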

The instructions above will most likely work for other Sony BDP-S* players, too.

AD Patch Worker Hangs on XDOLoader Process

Have you run an e-Business Suite R12 patch that slowed down or hung at the Java Loader steps for no apparent reason? I first encountered this issue in January, and finding a workable solution took several hours of research. No Oracle Support notes pointed directly to the issue at the time, although several more recent notes make the issue easier to identify and solve. Hopefully this post will be useful to someone else.

Platform: Red Hat Enterprise Linux Server
Application Version: e-Business Suite 12.1+

Symptoms:

The patch runs fine until it slows down and hangs partway through the Java loader (e.g., XDOLoader) steps for no apparent reason. There is no indication that the hang is caused by a database performance or locking issue.

Troubleshooting:

AD patch worker log error:

Error: Error connecting to database "jdbc:oracle:thin:APPS/xxxxxx@(DESCRIPTION=(LOAD_BALANCE=YES)(FAILOVER=YES)(ADDRESS_LIST=(ADDRESS=(PROTOCOL=tcp)(HOST=YOUR_HOST)(PORT=1521)))(CONNECT_DATA=(SID=YOUR_SID)))"
Io exception: Connection reset

Run jstack on the hanging java process:

"main" prio=10 tid=0x08937000 nid=0x22ea runnable [0xf73e1000]
java.lang.Thread.State: RUNNABLE
at java.io.FileInputStream.readBytes(Native Method)
at java.io.FileInputStream.read(FileInputStream.java:199)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:256)
at java.io.BufferedInputStream.read(BufferedInputStream.java:317)
- locked <0xf29b25a0> (a java.io.BufferedInputStream)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:218)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:258)
at java.io.BufferedInputStream.read(BufferedInputStream.java:317)
- locked <0xf29b2370> (a java.io.BufferedInputStream)
at sun.security.provider.SeedGenerator$URLSeedGenerator.getSeedByte(SeedGenerator.java:453)
at sun.security.provider.SeedGenerator.getSeedBytes(SeedGenerator.java:123)
at sun.security.provider.SeedGenerator.generateSeed(SeedGenerator.java:118)
at sun.security.provider.SecureRandom.engineGenerateSeed(SecureRandom.java:114)
at sun.security.provider.SecureRandom.engineNextBytes(SecureRandom.java:171)
- locked <0xf29b1fd0> (a sun.security.provider.SecureRandom)
at java.security.SecureRandom.nextBytes(SecureRandom.java:433)
- locked <0xf29b2250> (a java.security.SecureRandom)
at oracle.security.o5logon.O5LoginClientHelper.generateOAuthResponse(Unknown Source)
at oracle.jdbc.driver.T4CTTIoauthenticate.marshalOauth(T4CTTIoauthenticate.java:457)
at oracle.jdbc.driver.T4CConnection.logon(T4CConnection.java:367)
at oracle.jdbc.driver.PhysicalConnection.<init>(PhysicalConnection.java:510)
at oracle.jdbc.driver.T4CConnection.<init>(T4CConnection.java:203)
at oracle.jdbc.driver.T4CDriverExtension.getConnection(T4CDriverExtension.java:33)
at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:510)
at java.sql.DriverManager.getConnection(DriverManager.java:582)
at java.sql.DriverManager.getConnection(DriverManager.java:185)
at oracle.apps.xdo.oa.util.XDOLoader.initAppsContext(XDOLoader.java:558)
at oracle.apps.xdo.oa.util.XDOLoader.init(XDOLoader.java:455)
at oracle.apps.xdo.oa.util.XDOLoader.<init>(XDOLoader.java:413)
at oracle.apps.xdo.oa.util.XDOLoader.main(XDOLoader.java:2250)

Check /dev/random entropy:

cat /proc/sys/kernel/random/entropy_avail
NOTE: Higher numbers are better. The patch will begin to slow down or hang whenever entropy is ~50 or less.
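If you want to watch the pool drain while the patch workers run, a simple watch loop works (the 5-second interval is arbitrary):

watch -n 5 cat /proc/sys/kernel/random/entropy_avail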

Explanation:

The Java process depends on the /dev/random device to seed the SecureRandom class. If the kernel’s entropy pool runs dry, the patch workers calling SecureRandom block until enough entropy is available again.
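You can see the behavior directly from a shell on these kernels: a small read from /dev/urandom returns immediately, while the same read from /dev/random blocks once the entropy pool is depleted (the 64-byte read size is arbitrary):

time head -c 64 /dev/urandom > /dev/null   # returns immediately
time head -c 64 /dev/random > /dev/null    # hangs while entropy_avail is low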

Solutions:
NOTE: Pick one of the solutions below. Solution 1 is my preference, since it is specific to the e-Business Suite and should not affect other processes on the server.

  1. Search for all jre/lib/security/java.security files and replace (a find/sed sketch follows this list):

    securerandom.source=file:/dev/random
    with
    securerandom.source=file:/dev/urandom

  2. Run the rngd daemon to seed /dev/random with random numbers:
    Install the rng-utils package on Red Hat Enterprise Linux 5 or kernel-utils on Red Hat Enterprise Linux 4.
    rngd -r /dev/urandom -o /dev/random -f -t 1
  3. Replace the /dev/random device with /dev/urandom. (Not recommended for security reasons.)

    sudo mv /dev/random /dev/random.bak
    sudo ln -s /dev/urandom /dev/random
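For solution 1, here is a rough sketch of how to find and update every copy in one pass. The starting directory is a placeholder for your tech stack homes, and the in-place sed assumes GNU sed, so take backups first:

    find /u01/oracle/R12 -path '*jre/lib/security/java.security' 2>/dev/null |
    while read f; do
        cp "$f" "$f.bak"
        sed -i 's|securerandom.source=file:/dev/random|securerandom.source=file:/dev/urandom|' "$f"
    done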

New Web Hosting Provider

I recently switched from a web hosting plan with IX Web Hosting to a Virtual Private Server (VPS) plan with Rose Hosting.

I had the “Business Plus” plan with IX Web Hosting for 2 years. The service stability was always a little spotty, but the price-benefit ratio was acceptable for most of my time with them. My plan was set to expire, and the site’s performance and stability were becoming noticeably worse, so I decided to check out other options.

Beyond the degradation of stability and performance on IX Web Hosting, I found that I had trouble accessing files whenever they were generated by the web server processes. My user owned the directory structure, but backups and file uploads were owned by the web server user. Because of this, I could not delete those files or change their permissions, and I ended up writing a PHP script that ran as the web server user just to delete files.

Considering the limitations of a shared web hosting provider, I decided I would rather have full control over the services by using a dedicated or virtual private server. Because this site is not exactly “critical” to anyone, I focused my search on price rather than uptime. I ruled out a dedicated server based on the higher cost, leaving me with Linux virtual private servers.

The main open source virtualization options offered on Linux at this time are OpenVZ and Xen. They differ technically, but not enough for me to pick one over the other, so I continued my search based primarily on provider reviews and price points. The Debian Wiki offered a list of Linux VPS hosting providers, and I started looking at the plans that several of the providers offered. I was leaning toward VPSLink when I came across the Rose Hosting virtual server specials. The prices seemed too good to be true, but after searching for reviews, the provider seemed legitimate. They may not be as big or as stable as some of the other providers, but the price is right for a blog like mine.

I ordered the Rose Hosting service late at night on a weekend and did not receive an email reply with my connection information, so I contacted the provider’s support via email and a chat window. It turned out that their reply had been marked as spam by Gmail; once I pulled it out of my spam folder, things went smoothly.

I moved my MySQL database and website over to the Rose Hosting server and reconfigured my DNS entry. The best thing is that I can now monitor the server’s performance and uptime using all the basic Linux utilities. As of this time, the site has gone down once for 30 minutes as a result of a server outage. I did not inquire with support as to the cause, but the site has been stable otherwise.

If anyone is interested in having a cheap virtual private server for development or fun, I recommend Rose Hosting. I have only been with them for a month, so if my recommendation changes, I will post an update.

Amarok Variable Bit Rate (VBR) Track Time

I just recently installed Amarok on my Ubuntu desktop at home. It has been years since I have used anything other than XMMS to play my mp3 collection. Let me tell you: Amarok is probably the best mp3 player I have used, including anything I have used on Windows. (I do not have a Mac, so I can’t vouch for any music players beyond iTunes.)

One of the issues I had after installing it was that my variable bit rate (VBR) mp3 track lengths were all wrong. After reading a few HOWTOs and forum posts, I collected enough information to fix the issue by writing the correct track length into each mp3 tag.

  1. Download and install vbrfix.
    sudo apt-get install vbrfix
  2. Run vbrfix against all your mp3 files.
    find /myth/music -type f -name "*.mp3" -exec vbrfix {} {} \;
  3. Start Amarok.
  4. Rescan your music collection (assuming the songs had been added to the library).
    Tools > Rescan Collection
  5. Restart Amarok.
  6. The track times should be accurate.

I am still working out how to control my ALSA PCM channel via Amarok and LIRC; once I have a working setup, I will try to post it.
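The most promising direction so far is to have irexec call amixer directly. Here is a minimal, untested sketch of an ~/.lircrc entry, assuming the remote button is named VOL_UP in lircd.conf (the button name and step size are placeholders):

begin
    prog   = irexec
    button = VOL_UP
    config = amixer -q set PCM 2%+
    repeat = 1
end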