Can changing a SQL table column from VARCHAR2 to CLOB negatively impact systems that use cron scripts or PHP web pages?

I’m working on a complex system that runs multiple SQL queries from several script files and prints the data onto PHP pages.

I will be changing an Oracle column from type VARCHAR2(4000) to CLOB, so that the field can grow past 4000 in size.

I’ve been trying to understand as much of the code as possible to work out whether changing VARCHAR2 to CLOB could have any negative or unanticipated side effects. So far there appears to be no downside to swapping from VARCHAR2 to CLOB.

Any opinions or confirmation on this matter?
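One side effect worth checking for before flipping the column (a hypothetical sketch, not something from the question itself): PHP’s OCI8 extension fetches a VARCHAR2 column as a plain string, but fetches a CLOB column as an OCI-Lob object unless the query is fetched with the OCI_RETURN_LOBS flag, so scripts that assume string values may need a small normalization step. An illustrative helper:

```php
<?php
// Illustrative helper: normalize a fetched column value.
// With OCI8, CLOB columns come back as OCI-Lob objects (which expose
// a load() method) rather than plain strings, unless the rows are
// fetched with the OCI_RETURN_LOBS flag.
function lobToString($value) {
    if (is_object($value) && method_exists($value, 'load')) {
        return $value->load(); // read the full LOB contents as a string
    }
    return $value; // already a plain string (e.g. from a VARCHAR2 column)
}
```

Passing every fetched value of the converted column through a helper like this keeps the cron scripts and PHP pages working whether the column is VARCHAR2 or CLOB.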

Source: stackoverflow-php

Cron job doesn’t send email

I have a cron job set up on my HostGator server that returns a value of 1 (email sent), but doesn’t actually send the email. When I call the PHP file manually in the browser, the email is sent; when the same file runs via cron, it isn’t.

Earlier this month, I moved my site from one HostGator server to another so I could get SSL and a dedicated IP address. Since the move, the cron jobs work OK except for the part that sends the email (i.e., the database functions work fine). I’ve contacted HostGator tech support, but they think the problem is in my code.

Thinking that maybe my server info was incorrect, I switched to a Gmail account to send the mail, but that didn’t work either. Please help!

The cron job is set up like this:

* 7 * * * /opt/php56/bin/php /home/user/public_html/somefolder/sendmailcron.php

(While testing, I changed it to run every 2 minutes: */2 * * * * )

Here’s the sendmailcron.php script:


$now = date("Y-m-d H:i:s");

$msgcontent = [];
$msgcontent['email'] = "";
$msgcontent['name'] = "Recipient Name";
$msgcontent['textpart'] = "This is the text version of the email body.";
$msgcontent['htmlpart'] = "<p>This is the <strong>HTML</strong> version of the email body.</p>";
$msgcontent['subject'] = "Test email sent at " . $now;

$result = sendMyMail($msgcontent, "HTML");

function sendMyMail($msgcontent, $format = "HTML") {
    require_once '/home/user/public_html/somefolder/swiftmailer/lib/swift_required.php';

    $result  = 0;
    $subject = $msgcontent['subject'];
    $email   = $msgcontent['email'];
    if (strlen($email) == 0) {
        return 0; // no recipient address, nothing to send
    }
    $name = $msgcontent['name'];

    $emailbody = $msgcontent['textpart'];
    $emailpart = $msgcontent['htmlpart'];

    switch ($format) {
        case "TEXT":
            $msgformat = 'text/plain';
            break;
        case "HTML":
        default:
            $msgformat = 'text/html';
            break;
    }

    $adminemailaddress = "";
    $adminemailpwd     = 'myadminpwd';
    $sendername        = 'My Real Name';

    $transport = Swift_SmtpTransport::newInstance('', 465, "ssl")
        ->setUsername($adminemailaddress)
        ->setPassword($adminemailpwd);

    $mailer = Swift_Mailer::newInstance($transport);

    // Create the message
    if ($format == "TEXT") {
        $message = Swift_Message::newInstance($subject)
            ->setFrom(array($adminemailaddress => $sendername))
            ->setTo(array($email => $name))
            ->setBody($emailbody, $msgformat);
    } else {
        $message = Swift_Message::newInstance($subject)
            ->setFrom(array($adminemailaddress => $sendername))
            ->setTo(array($email => $name))
            ->setBody($emailpart, $msgformat)
            ->addPart($emailbody, 'text/plain');
    }

    // This is where we send the email
    try {
        $result = $mailer->send($message); // returns the number of recipients accepted
    } catch (Swift_TransportException $e) {
        $response = $e->getMessage();
    }
    return $result; // will be 1 on success or 0 on failure
}

The return value is always 1, but no email is sent.
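Part of what makes this hard to debug is that the catch block swallows the transport exception, and a cron run has nowhere to display it. A minimal sketch (the log path and function name are made up for illustration) that appends any error to a file you can inspect after the cron run:

```php
<?php
// Illustrative helper: append a timestamped line to a log file so that
// errors raised during a cron run (which has no browser to show them)
// can be inspected afterwards.
function logCronError($logfile, $message) {
    $line = date("Y-m-d H:i:s") . " " . $message . "\n";
    file_put_contents($logfile, $line, FILE_APPEND);
}
```

Calling something like `logCronError('/home/user/cron_mail.log', $e->getMessage());` inside the catch block would show whether the SMTP connection behaves differently under cron than in the browser.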

Any suggestions would be greatly appreciated!

Source: stackoverflow-php

Scaling curl with php and cron

I am trying to create a website-monitoring web app using PHP. At the moment I’m using curl to collect headers from different websites and update a MySQL database when a website’s status changes (e.g. if a site that was ‘up’ goes ‘down’).

I’m using curl_multi (via the Rolling Curl X class, which I’ve adapted slightly) to process 20 sites in parallel, which seems to give the fastest results, and CURLOPT_NOBODY to make sure only headers are collected. I’ve also tried to streamline the script to make it as fast as possible.
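For a single site, the header-only check described above looks roughly like this (a sketch, not the asker’s actual Rolling Curl X code):

```php
<?php
// Illustrative single-URL status check: CURLOPT_NOBODY asks the server
// for headers only (a HEAD-style request), so no response body is
// downloaded. Returns the HTTP status code, or 0 if the request failed
// (e.g. DNS error or connection timeout).
function fetchStatus($url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);
    curl_exec($ch);
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    return $code;
}
```

curl_multi runs many handles like this concurrently instead of checking sites one at a time.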

It is working OK and I can process 40 sites in approximately 2-4 seconds. My plan has been to run the script via cron every minute, so it looks like I will be able to process about 600 websites per minute. Although this is fine for now, it won’t be enough in the long term.

So how can I scale this? Is it possible to run multiple cron jobs in parallel, or will this run into bottlenecking issues?

Off the top of my head, I was thinking that I could break the database into groups of 400 and run a separate script for each group (e.g. IDs 1-400, 401-800, 801-1200, etc.), so there would be no danger of database corruption. That way each script would complete within a minute.
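The grouping idea can be sketched as a helper that computes the contiguous ID range each cron-launched worker would handle (the names here are illustrative, not from the question):

```php
<?php
// Illustrative helper: split the ID space 1..$totalIds into contiguous
// batches of at most $batchSize IDs, one batch per worker script
// launched from cron. Returns an array of [firstId, lastId] pairs.
function idBatches($totalIds, $batchSize) {
    $batches = [];
    for ($start = 1; $start <= $totalIds; $start += $batchSize) {
        $batches[] = [$start, min($start + $batchSize - 1, $totalIds)];
    }
    return $batches;
}
// idBatches(1200, 400) → [[1, 400], [401, 800], [801, 1200]]
```

Because each worker only ever writes rows inside its own ID range, no two scripts touch the same rows, which is what makes the “no danger of database corruption” claim hold.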

However, it feels like this might not work, since the one script running curl_multi seems to max out at 20 parallel requests. So will this work, or is there a better approach?

Source: stackoverflow-php