
Easy Script to Test your CRON Job

Developers would know this… oftentimes, we set up cron jobs to run at specific times to execute a specific file or perform a task, but we don't really know if they're running as they should.

Now I know many developers choose to have a script that logs its execution to a log file somewhere or outputs it to a TXT file. That is one alternative, but I found this little PHP script online on InkPlant. It's quite useful because it does the same thing (logging each execution of the main file to a TXT document).

If you are adding a job through crontab, here’s a good way to know if it’s actually running:

1) Create a blank text document named cron_test.txt and upload it into your script's folder. Change the CHMOD settings (permissions) on it to 777 so that it's writable by the server.
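If you have shell access, you can do this from the command line (the path here is just an illustration, matching the crontab line further below):

chmod 777 /yourfolder/cron_test.txt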

2) Create a new PHP file called "cron_test.php". This script simply writes a line of details to the cron_test.txt file you created in Step 1.

<?php
// Build a log line: the date/time of this run plus the user cron ran it as.
// $_SERVER['USER'] may not be set in some cron environments, so fall back to "unknown".
$user = isset($_SERVER['USER']) ? $_SERVER['USER'] : "unknown";
$crontext = "Cron Run at ".date("r")." by ".$user."\n";
// Append the line to cron_test.txt in the same folder as this script.
$filename = dirname($_SERVER['SCRIPT_FILENAME'])."/cron_test.txt";
$fp = fopen($filename, "a") or die("Open error!");
fwrite($fp, $crontext) or die("Write error!");
fclose($fp);
echo "Wrote to ".$filename."\n\n";
?>

3) Add the following line to your crontab, or via cPanel if you're using one.

* * * * * php /yourfolder/cron_test.php
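If you're adding it by hand over SSH rather than through cPanel, the standard crontab commands do the job:

crontab -e   # opens your user's crontab in an editor; paste the line above and save
crontab -l   # lists the installed jobs so you can confirm it took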

4) You should be able to see all the execution data in "cron_test.txt" (with the schedule above, that's a new line every minute).
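A handy way to watch the entries come in live, if you have shell access (same illustrative path as before):

tail -f /yourfolder/cron_test.txt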

Hope you found that useful. If you have a neater solution to test cron jobs, please do share. I would love to know about it.


Magento: Slow Backend but a Fast Frontend

The past two days have been a nightmare. We recently migrated all of our websites to Amazon Web Services (AWS), and the speed has been good. We love it. The infrastructure is excellent and so is the service we're getting. I don't have a lot of good things to say about their support, though. Unless you are a reasonably big enterprise spending a lot of dollars, you can't afford their support packages. What I suggest from my personal experience is to subscribe to their developer support: if you get into issues relating to operating websites on their servers, they usually point you in the right direction, and you generally get a response within 24 hours, which is OK.

The reason I am writing this post, though, is not AWS. It's actually our experience with Magento. Over the past two days, I have learned so much about the Magento e-commerce platform. One of our clients, who runs one of the biggest online pharmacies in New Zealand, YourChemist.co.nz, hosts with us. The database is big, and so are the files. Migrating to AWS took a while, but we got there eventually. Since this website is busy all through the day, the only time we could migrate was at midnight, when it has the least amount of traffic.

After migrating, we noticed a significant problem: the speed of Magento's backend, or admin panel as some call it, was terrible. So I did a little research on tackling this issue.


Linux/Unix Commands to Unzip GZ Files

GZ is no doubt one of the best choices for database compression. We use it quite often when importing and exporting files. In fact, we recently used it for migrating our MySQL databases to Amazon RDS.

To extract a GZ file, use the gunzip command:

gunzip file.gz

If that doesn't work, try this:

gzip -d file.gz
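Note that both commands above replace file.gz with the extracted file. If you want to keep the original archive around, decompress to stdout and redirect it (this also works on older gzip versions that lack the -k flag):

gunzip -c file.gz > file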

To check that the extracted file exists, enter the following:

ls -l

Import MySQL Database via SSH

We have been migrating a lot of our MySQL databases across to Amazon RDS due to our extensive use of AWS services. In the process, we realized the traditional phpMyAdmin UI doesn't do the job anymore.

It times out, and we had to keep uploading the same SQL file multiple times to get it to import bit by bit. It becomes quite a handful, and somewhat frustrating, especially when your database is more than 100MB or so in size.

An alternative is to import through good ole SSH. This is how we do it: we export the database to an SQL file, then log in to our remote database (Amazon RDS) and import it into that remote DB instance. The commands below are for a remote instance, not for importing into a localhost MySQL.
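For the export step, a plain mysqldump on the old server does the trick (the user, database, and file names here are placeholders):

mysqldump -u YOURUSERNAME -p DATABASENAME > FILENAME.sql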

1) Log in to MySQL with your root user by using the following command.

mysql -h main.xxxxxxxxxxx.us-east-1.rds.amazonaws.com -P 3306 -u YOURUSERNAME -p

2) Once you are in, you can select the database you want to import into by entering the following:

USE DATABASENAME;

3) You should get a "Database changed" message. Then enter the following to import the desired SQL file (use the full path if the file isn't in the directory you launched mysql from):

SOURCE FILENAME.sql

4) You should see a whole load of queries being executed. After a while, it should stop, and hopefully you'll have a fully imported database ready to be used.
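As an aside, you can skip the interactive shell entirely and pipe the file in as a one-liner, using the same placeholders as above:

mysql -h main.xxxxxxxxxxx.us-east-1.rds.amazonaws.com -P 3306 -u YOURUSERNAME -p DATABASENAME < FILENAME.sql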

Happy Migrating!