Friday, July 22, 2016

Configuring Bacula to backup an Apple Mac to a remote backup server

Scenario:
A set of Apple Mac desktop machines running OS X El Capitan needs to be backed up to a remote Bacula server. This post describes the steps I took to get the setup working.

Environment:
Backup Server: Scientific Linux 7
Bacula director: Bacula 7.0.5, upgraded to 7.4.3
Client machines: Mac OS X El Capitan
Bacula client: bacula-fd 7.4.3

Step 1.
Install Homebrew package manager for Mac.

Step 2.
Install bacula-fd - Bacula's File Daemon
Only the Bacula file daemon needs to be installed on the client machines.
brew install bacula-fd

Make the bacula-fd daemon start at boot:
sudo brew services start bacula-fd
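
A quick, hedged check that the daemon is actually running and listening on the standard FD port (TCP 9102):

brew services list                  # bacula-fd should show as started
sudo lsof -iTCP:9102 -sTCP:LISTEN   # the bacula-fd process should be listening here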

Step 3.
Upgrade Bacula on the server.
Red Hat repositories only carry Bacula 7.0.5 binaries at the time of this post. Since the client installed by Homebrew is version 7.4.3, it is a good idea to upgrade the director and the storage daemon to a version equal to or higher than the client's.

a. Back up your Bacula configuration files in the server's /etc/bacula directory.
b. Back up the Bacula PostgreSQL or MySQL database.
c. Download the latest Bacula source.
d. Run ./configure --with-postgresql. Other configuration options are listed here.
e. Run make and then make install.
f. Run the database update script.
g. Copy your existing configuration (conf) files over the new ones.
h. Run chkconfig bacula-dir on, chkconfig bacula-sd on and chkconfig bacula-fd on so they start at boot.
A condensed shell sketch of this sequence is shown below.
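
The same sequence as a rough shell sketch, assuming a PostgreSQL catalog and the 7.4.3 source unpacked into ~/bacula-7.4.3 (paths, database name and the update script location are illustrative and may differ on your system):

cp -a /etc/bacula /root/bacula-conf-backup            # a. configuration backup
pg_dump bacula > /root/bacula-catalog-backup.sql      # b. catalog backup
cd ~/bacula-7.4.3                                     # c. unpacked source tree
./configure --with-postgresql                         # d.
make && sudo make install                             # e.
sudo update_bacula_tables                             # f. database update script shipped with Bacula (location varies)
sudo cp /root/bacula-conf-backup/*.conf /etc/bacula/  # g. restore existing conf files
sudo chkconfig bacula-dir on                          # h. start at boot
sudo chkconfig bacula-sd on
sudo chkconfig bacula-fd on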

Step 4.
Configuring bacula-fd on clients.
Machines running bacula-dir, bacula-sd and bacula-fd must be able to resolve each other through DNS or hosts-file records.

The log file for bacula-fd on client machines is located at:
/usr/local/var/log/bacula
This log file will not be used until you specifically reference it in the conf file; a sketch of how to do that follows.
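
A minimal, hedged example of referencing that log file in bacula-fd.conf by adding an append line to the Messages resource (the director name is illustrative; the path matches the Homebrew layout above):

Messages {
  Name = Standard
  director = backupserver-dir = all, !skipped, !restored
  append = "/usr/local/var/log/bacula/bacula-fd.log" = all, !skipped
}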

Sample config file for bacula-fd is shown below. Note the storage {} clause  included in it.


Step 5.
Configure clients, file sets (FileSets), backup jobs and schedules on the Bacula director.
You can edit the conf files manually or use Webmin for this part. A sketch of the relevant director resources follows.
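
As orientation, a hedged sketch of the bacula-dir.conf resources for one Mac client. All names, addresses, storage/pool references and passwords are illustrative; the Client password must match the one in the client's bacula-fd.conf:

Client {
  Name = mac01-fd
  Address = mac01.myorg.org.au
  FDPort = 9102
  Catalog = MyCatalog
  Password = "same-secret-as-in-bacula-fd.conf"
}

FileSet {
  Name = "Mac Home Dirs"
  Include {
    Options {
      signature = MD5
      compression = GZIP
    }
    File = /Users
  }
}

Schedule {
  Name = "NightlyMac"
  Run = Full 1st sun at 23:05
  Run = Incremental mon-sat at 23:05
}

Job {
  Name = "Backup-mac01"
  Type = Backup
  Client = mac01-fd
  FileSet = "Mac Home Dirs"
  Schedule = "NightlyMac"
  Storage = File
  Pool = Default
  Messages = Standard
}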

Some error messages you may come across:
1. Fatal error: Unable to authenticate with File daemon at "myserver.ip.address:9102". Possible causes: Passwords or names not the same, Maximum Concurrent Jobs exceeded on the FD, or FD networking messed up (restart daemon)

This means that your Bacula director cannot talk to the Bacula file daemon. Follow these steps to resolve it (a firewall and connectivity sketch follows the list):
a. Disable or add exceptions for the bacula-fd port (TCP 9102) on the client and server firewalls.
b. Make sure that the clients and the server can resolve each other.
c. Make sure that the passwords are right.
d. Restart the client Mac. Any port binding issues will be resolved by that.
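
A hedged sketch of the firewall and connectivity checks. Hostnames are illustrative; the server side assumes firewalld on Scientific Linux 7. Standard Bacula ports: director 9101, file daemon 9102, storage daemon 9103.

# On the Scientific Linux 7 server, open the director and storage daemon ports:
sudo firewall-cmd --permanent --add-port=9101/tcp
sudo firewall-cmd --permanent --add-port=9103/tcp
sudo firewall-cmd --reload

# Connectivity checks:
nc -vz mac-client.myorg.org.au 9102      # from the director host to the client's file daemon
nc -vz backupserver.myorg.org.au 9103    # from the client Mac to the storage daemon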

2. Warning: Cannot bind port 9102: ERR=Address already in use: Retrying ...
You may see this error in the bacula-fd client log. Restarting the client machine will resolve it.

3. Warning: bsock.c:107 Could not connect to Storage daemon on localhost:9103. ERR=Connection refused
a. You need to include the storage {} clause in your bacula-fd.conf file to point it at the remote storage server. Otherwise it will look for a Bacula storage daemon on the local host.
b. Open up TCP port 9103 on client and server firewalls.

Saturday, May 7, 2016

qsub-Bad UID for job execution MSG=User does not exist in server password file

Scenario:
A High Performance Computing cluster running Scientific Linux. Head nodes and compute nodes are members of a Windows 2012 R2 Active Directory domain. Users log in to client nodes using their AD login and then SSH (Kerberos-enabled passwordless login) in to the head node to submit PBS job scripts to the cluster.

The kinit and id commands work on the head node and display the relevant information for the domain user, which means that the head node can connect to Active Directory and resolve domain usernames.

Issue:
When submitting PBS jobs, users see the following error:
[domainuser@myorg.org.au@HPC torque]$ qsub FirstJob.pbs 
qsub: submit error (Bad UID for job execution MSG=User domainuser does not exist in server password file


When a job is submitted, Torque calls the system function getpwnam_r to look up the user who is submitting the job. The error here is misleading, as it sounds like getpwnam_r only looks for the user in the "server password file". According to the man page, when configured, it also searches NIS and LDAP for the given user.

"The getpwnam() function returns a pointer to a structure containing the broken-out fields of the record in the password database (e.g., the local password file /etc/passwd, NIS, and LDAP) that matches the username name.
The getpwuid() function returns a pointer to a structure containing the broken-out fields of the record in the password database that matches the user ID uid."

Reason:
What causes this error is that getpwnam_r is looking for the user domainuser@myorg.org.au instead of domainuser in the authentication databases, in this case the passwd file as well as Active Directory.
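
You can reproduce the same lookup outside Torque with getent, which goes through the same NSS/SSSD path as getpwnam_r (usernames are illustrative):

getent passwd domainuser                # short name
getent passwd domainuser@myorg.org.au   # fully qualified name

With use_fully_qualified_names set to True, only the fully qualified form resolves; with it set to False (the fix below), the short form resolves as well.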

Fix:
Edit the /etc/sssd/sssd.conf file and change the option use_fully_qualified_names to False, so that SSSD looks up the plain username in Active Directory. A sketch of the change follows.
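
A minimal sketch of the change, assuming the domain section is named after your AD domain (illustrative); restart SSSD afterwards so it takes effect:

# /etc/sssd/sssd.conf
[domain/myorg.org.au]
use_fully_qualified_names = False

sudo systemctl restart sssd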

Torque Make error: mom_mach.h: No such file or directory

Issue: 
While running make to compile the Torque resource manager, you may run into the error:
site_mom_chu.c:25:22: fatal error: mom_mach.h: No such file or directory
#include "mom_mach.h"


Environment: 
OS: Scientific Linux 7.2
TORQUE Resource Manager :  6.0.1

Fix: 
To fix this issue, simply give full access to all the files in the Torque source folder:

chmod 777 -R *

Then attempt the make again.

The only other available web resource regarding this issue is here:

Monday, April 11, 2016

Configuring XNAT to use Active Directory LDAP Authentication

Intro:
XNAT is an open source imaging informatics platform developed by the Neuroinformatics Research Group at Washington University. XNAT was originally developed in the Buckner Lab at Washington University, now at Harvard University. It facilitates common management, productivity, and quality assurance tasks for imaging and associated data. Thanks to its extensibility, XNAT can be used to support a wide range of imaging-based projects.

Tested with the following versions:
OS : Scientific Linux 7.2
XNAT version: 1.6.5
Java Version:  1.7.0_79
AD : Windows Server 2012 R2

Let's assume that:

  • Your organisation's Active Directory domain is: myorg.com.au
  • All your users are located in the People OU in the root of the domain
  • The directory server's DNS name is: dc01.myorg.com.au
  • The LDAP service account used to access and read domain information is located in: myorg.com.au/People/Service Accounts
  • The service account to access the directory is srvldap and its password is password

Official documentation on how to configure XNAT for LDAP authentication is located here.
Services.Properties Configuration - XNAT 1.6.x Documentation - XNAT Documentation Wiki

The purpose of this post is to provide you with accurate configuration options to make XNAT work with Active Directory. 

This is what a working configuration looks like in XNAT's /apache-tomcat-7.0.68/webapps/xnat/WEB-INF/conf/services.properties file. (Note that the path will differ in your installation.)

############# services.properties  ############# 
# Comma-separated list of the providers that users will be able to use to authenticate.
provider.providers.enabled=db,ldap1

provider.db.name=LOCAL
provider.db.id=localdb
provider.db.type=db

# Add "ldap1" to the enabled provider list above and fill in the missing fields to enable LDAP authentication.
provider.ldap1.name=MYORG
provider.ldap1.id=ldap1
provider.ldap1.type=ldap
provider.ldap1.address=ldap://dc01.myorg.com.au:389/dc=myorg,dc=com,dc=au
provider.ldap1.userdn=myorg.com.au/People/Service Accounts/srvldap
provider.ldap1.password=password
provider.ldap1.search.base=ou=People
provider.ldap1.search.filter=(sAMAccountName={0})

############ END services.properties  ###########

Note that:
1. In the provider.ldap1.address field, I have used dc=myorg,dc=com,dc=au instead of the recommended dc=au,dc=com,dc=myorg. This order is important; otherwise you will get the following error in your XNAT security.log file.

"Authentication request failed: org.springframework.security.authentication.BadCredentialsException: Bad credentials"

2. The provider.ldap1.userdn field uses the canonical name instead of the DN.

Some helpful tips:
1. To enable debugging in the XNAT security log, change the flags shown below in the log4j.properties file.
This file is located in the /apache-tomcat-7.0.68/webapps/xnat/WEB-INF/conf/ folder.

Change the flags from WARN to DEBUG:

# Security logs, both Spring Framework and XNAT
log4j.category.org.springframework.security=DEBUG, security
log4j.additivity.org.springframework.security=false
log4j.category.org.nrg.xnat.security=DEBUG, security
log4j.additivity.org.nrg.xnat.security=false

2. Use JXplorer to test connectivity to Active Directory. A Java-based tool like JXplorer helps you troubleshoot this type of scenario better, since XNAT is also Java-based. A command-line ldapsearch check (sketched below) is also handy.
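
A hedged ldapsearch check using the example names assumed earlier (bind account srvldap, the People OU and the dc01 host); replace testuser with a real account:

ldapsearch -H ldap://dc01.myorg.com.au:389 \
  -D "srvldap@myorg.com.au" -w password \
  -b "ou=People,dc=myorg,dc=com,dc=au" \
  "(sAMAccountName=testuser)" cn mail

If this returns the user, the bind account, search base and filter that XNAT will use are all working.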

A helpful reference: XNAT 1.6.3 LDAP Error - Google Groups 

Wednesday, July 8, 2015

Handling Acrobat fillable forms

Acrobat : 10.1.14
Acrobat Reader:  11.0.11
Server Side Processing : PHP

Recently I was creating a PDF fillable form in Acrobat Pro and wanted an easier method of receiving the filled-in contents. The easiest method seems to be email. I was looking for a solution a bit better than the JavaScript mailto: method, which opens the default email client and populates the mail with the filled-in information.

The form I was working with did in fact use the JavaScript mailto: method. But when I tried to edit the JavaScript to insert some additional code, it always complained with the error:

SyntaxError: unterminated string literal


There are some tips around for fixing it, but nothing worked for me.


According to a few sources, this seems to be a bug in Acrobat's JavaScript processing engine. Since none of the "tips" worked in my case, I tried a different method to process the form.

1. Creating Forms

I’m not going to describe how I created the form here. Lynda.com has a great tutorial on creating Acrobat forms. Have a look (if you have access); you'll find it really informative and helpful.

2. Submit Button Properties

I have used the XFDF format to send the data back, as it is plain-text XML and easy to debug and work with. A sample submission is sketched below.
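
For reference, the XFDF payload Acrobat submits looks roughly like this minimal, hedged sample (field names are illustrative). This is the structure the parser below walks as XFDF > FIELDS > FIELD > VALUE:

<?xml version="1.0" encoding="UTF-8"?>
<xfdf xmlns="http://ns.adobe.com/xfdf/" xml:space="preserve">
  <fields>
    <field name="FirstName"><value>Jane</value></field>
    <field name="Email"><value>jane@example.com</value></field>
  </fields>
</xfdf>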



3. Form submission PHP script - submitForms.php

<?php
 //Send header of type - Adobe FDF
 header ("Content-Type: application/vnd.fdf"); 

 /* Get the submitted contents - could not use the $_POST variable because URL rewriting is active in .htaccess; in that case $_POST is always empty */
 $content = file_get_contents("php://input");

//-----------------------------------------
/* Code from : http://blog.rubypdf.com/2008/12/22/parsing-xfdf-in-php/
 Modified to use a variable instead  a file */


    /* BEGIN VARIABLE DECLARATIONS */
    //global variables for XML parsing
    $values = array();
    $field = "";
    $curTag = "";
     
    /* BEGIN XML PROCESSING */
    // XML Parser element start function
    function startElement($parser, $name, $attrs)
    {
        global $curTag, $field;
        
        //track the tag we're currently in
        $curTag .= "^$name";
        
        if( $curTag == "^XFDF^FIELDS^FIELD" )
        {
            //save the name of the field in a global var
            $field = $attrs['NAME'];
        }
    }
     
    // XML Parser element end function
    function endElement($parser, $name)
    {
        global $curTag;
        // remove the tag we're ending from the "tag tracker"
        $caret_pos = strrpos($curTag,'^');
        $curTag = substr($curTag, 0, $caret_pos);
    }
     
    // XML Parser characterData function
    function characterData( $parser, $data )
    {
        global $curTag, $values, $field;
        
        $valueTag = "^XFDF^FIELDS^FIELD^VALUE";
        
        if( $curTag == $valueTag )
        {
            // we're in the value tag, 
            //so put the value in the array
            $values[$field] = $data;
        }
    }
     
    // Create the parser and parse the file
    $xml_parser = xml_parser_create();
    xml_set_element_handler($xml_parser,  "startElement", "endElement");
    xml_set_character_data_handler($xml_parser, "characterData");
     

        if (!xml_parse($xml_parser, $content ))
        {
            die(sprintf("XML error: %s at line %d", 
            xml_error_string(xml_get_error_code($xml_parser)), 
            xml_get_current_line_number($xml_parser)));
        }
    
     
    xml_parser_free($xml_parser);
    /* END XML PROCESSING */ 
    
//-----------------------------------------
// Compose Message with form data
$message = '<html><body>';
foreach ($values as $key => $value) {
    $message .= "<b>$key</b> : $value <br><br>";
}
$message .= '</body></html>';

// Mail Message
$to      = "tiraj@XXXXX.onmicrosoft.com";
$subject = "Survey Form";
$header  = "From:donotreply@XXXXX.org.au \r\n";
$header .= "MIME-Version: 1.0\r\n";
$header .= "Content-Type: text/html; charset=ISO-8859-1\r\n";
$retval  = mail($to, $subject, $message, $header);

if ($retval == true) {
    /* FDF response taken from:
       http://leeboyge.blogspot.com.au/2011/07/sending-fdf-response-back-to-pdf.html */
    echo "%FDF-1.2 \n 1 0 obj \n << \n /FDF \n << \n /Status (Your form has been successfully submitted.) >> \n >> \n >> \n endobj \n trailer \n << \n /Root 1 0 R \n >> \n %%EOF";
} else {
    echo "%FDF-1.2 \n 1 0 obj \n << \n /FDF \n << \n /Status (Form Submission failed) >> \n >> \n >> \n endobj \n trailer \n << \n /Root 1 0 R \n >> \n %%EOF";
}
?>

4. Receiving mails in Office 365

Mails sent from a PHP script on any reputable server will reach Gmail or any other typical mail service in seconds. But with Office 365, these mails do not get delivered to the user's inbox unless you do some additional tweaking.

1. Whitelist the IP address.
In the Exchange ECP go to Protection, then to the Connection Filter tab, and add the server IP to the IP Allow List.

2. Add the domain or server email address to the safe senders list.
In the Exchange ECP go to Protection, then to the Spam Filter tab, and add the sender and the sender's domain to the allow lists.

3. Use the onmicrosoft.com email address instead of your corporate address.
Your Office 365 Business account always has a default email address whose domain part looks like @yourorg.onmicrosoft.com. You MUST use this email address in your script to receive the emails.

5. Return value from PHP script

This cannot be plain text. You have to set the content type of the response header to Acrobat Forms Data Format (FDF). If your script returns anything other than an FDF response, Acrobat Reader will complain with:

An error occurred during the submit process. Can not process content type text/html


Once you specify the correct return type with header("Content-Type: application/vnd.fdf"); and construct the FDF return data properly, Acrobat Reader will happily accept it. Note that it is important to place the \n characters in the right places, as they are part of the FDF structure.



Wednesday, March 19, 2014

Fix For: Xerox Printers taking long time to print PDF documents


Scenario:
Certain PDFs that are full of graphics take a long time to print. Once the document is sent to the printer, it can take up to 5 minutes to print the first page. Then the printer waits for another 3 minutes, spits out another page, and this wait-and-print cycle goes on.

These files are not large (about 200 KB-300 KB). When you send them to the printer, you can see in the print queue that the file has bloated into a 30-50 MB job.

Any other file format gets printed within a few seconds.

Environment:
- Xerox ApeosPort or Xerox DocuCentre printers
- Windows 7 64-bit
- Windows Server 2008 R2 print server with the latest PCL and PostScript print drivers

Troubleshooting:
1. Removed and reinstalled all print drivers on the print server and on client machines.
2. Tested with PCL and PS drivers.
3. Disabled bi-directional support.
4. Checked/disabled firewall and antivirus.
5. Attempted these steps suggested by Tom Zhang.

None of these fixes worked.

Fix:
So then what finally worked?

The printer's network card was set to 100 Mbps full-duplex. Setting this to Auto brought the communication between the printer and the print server up to an acceptable speed. What took 5 minutes to print got printed within 10 seconds after that.


Thursday, March 13, 2014

Fixes for: Delay in receiving Emails with large (3MB-15MB) attachments


Scenario:

Since we implemented our new Exchange infrastructure, users had been having trouble receiving emails with large attachments from external senders. The puzzling factor was that delayed emails eventually flowed through, and the delay seemed completely random, ranging from 1 day to 4 days per message. Meanwhile, senders were getting "Delivery is delayed to these recipients or groups:" notifications like the one below.



Environment:

- Windows Server 2012 R2
- Exchange 2013 with CU3
- Sophos Puremessage Spam filtering/AV on Mailbox role servers
- Cisco infrastructure

Troubleshooting steps:

Header analysis:

When these emails eventually came through, header analysis revealed that the bottleneck was between the last Internet email relay and the organization's edge Client Access server.

Good online header analyzer tools include the Microsoft Remote Connectivity Analyzer (Message Header Analyzer).


In the header analysis you can see that Hop 8 has taken 3 days to complete. Internal emails and emails sent to outside parties get delivered immediately.

Exchange Logs:

I enabled Exchange logging to see what was causing this bottleneck. What I wanted to check were the SMTP transactions between the external hosts and the organization's edge CAS server. For this you need to analyse the Exchange protocol logs.

Exchange LogViewer is a simple, good and free tool to analyse these logs.


Notice that the external relay makes several delivery attempts but only one of many passes through. Also notice the SMTP "451 4.7.0 Timeout waiting for client input" message that gets logged before the failed delivery session times out.

So what was causing these timeouts? These are the areas I thought needed to be checked:

1. Timeout setting on the Exchange receive connector
2. Sophos PureMessage/AV scanning hindering the mail flow
3. Packet MTU size settings on the Client Access servers and the edge router

You can check the timeout settings for the Exchange receive connectors with the Get-ReceiveConnector cmdlet in the Exchange Management Shell, and change them to any desired value with:

Set-ReceiveConnector -Identity "host\Internet Receive Connector" -ConnectionTimeout 00:15:00

Make sure you restart the Exchange Transport service or restart the server for this change to take effect.
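
A quick, hedged sketch of the check-and-restart sequence from the Exchange Management Shell:

# Check the current timeout on all receive connectors
Get-ReceiveConnector | Format-Table Name, ConnectionTimeout

# After changing it, restart the transport service so the new value takes effect
Restart-Service MSExchangeTransport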

In my case I changed this from the default 10 minutes to 20 minutes. It didn't fix the problem.

Most people, in response to this type of timeout, suggest that it can be caused by a spam filtering/AV solution on the server. In my case I disabled all AV/spam filtering services installed on the CAS and Mailbox servers and restarted the servers. No luck; emails were still getting delayed.

In a few places, people have suggested that MTU packet size issues can cause mails with large attachments to fail. See the Microsoft article "How to Troubleshoot Black Hole Router Issues" for more details and the steps for troubleshooting MTU-related issues.

This can be a real issue if your firewall is blocking incoming/outgoing ICMP traffic. Nodes along the way need to do Path MTU Discovery using the ICMP protocol, and blocking ICMP will hinder this discovery process.
 
So, make sure that you can ping the edge CAS server using large packets (a sketch follows). Apart from that, you don't have to muck around with MTU settings on the routers or Windows hosts as some people suggest.
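
A hedged example of that check with large, non-fragmenting packets (the hostname is illustrative; 1472 bytes of payload plus 28 bytes of ICMP/IP headers fills a standard 1500-byte MTU):

ping -f -l 1472 cas01.myorg.org.au      (from a Windows host)
ping -M do -s 1472 cas01.myorg.org.au   (from a Linux host)

If these fail while smaller packet sizes succeed, something on the path is dropping large packets or the ICMP "fragmentation needed" replies.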



So, what was ultimately causing this email delay?

In the end, on one forum someone suggested that any type of SMTP analysis/filtering done on a firewall can cause SMTP timeouts.

“Is there a firewall? I'm suspicious that you have a firewall that's interfering with the SMTP traffic.”

Analyzing the zone-based Cisco IOS firewall on the edge router revealed that it was configured to "inspect" SMTP traffic from outside to the CAS server in the DMZ.

Instead of inspecting SMTP traffic specifically, I changed it to generic TCP/UDP traffic inspection, and suddenly all the mails with large attachments that had been trying to get through for days started pouring in. A rough sketch of the kind of change is shown below.
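
As a rough, hedged sketch only (the class-map name is illustrative, and the exact zone-pair and policy-map layout depends entirely on your configuration), the inspect class was switched from application-level SMTP inspection to generic TCP/UDP inspection:

class-map type inspect match-any OUTSIDE-TO-DMZ-TRAFFIC
  no match protocol smtp
  match protocol tcp
  match protocol udp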