  1. #1
    Just Joined! | Join Date: Jun 2006 | Posts: 12

    wget web_page that uses curl


    I have a web page that uses curl to communicate with a credit card processing gateway. The curl arguments specify the web page to open on success and another for failure. When I run it from a browser, it works perfectly. However, I need to run it periodically, so I am attempting to use wget in the crontab to launch the page. The problem is that the reply a browser would normally receive gets written to a text file in the root directory instead. Is there an alternative to wget that will act as a browser to receive the reply?

  2. #2
    Linux User | Join Date: Dec 2009 | Posts: 264
    Quote Originally Posted by steveheflin:
    I have a web page that uses curl to communicate with a credit card processing gateway.
    You've what??

    Quote Originally Posted by steveheflin:
    The curl arguments specify the web page to open on success and another for failure.
    What??

    Quote Originally Posted by steveheflin:
    The problem is that the reply that is expected to be received by a browser, instead gets written to a text file in the root directory. Is there an alternative to wget that will act as a browser to receive the reply?
    ... yes wget or curl ...

  3. #3
    Just Joined! | Join Date: Jun 2006 | Posts: 12
    I gather from the response that my post was too terse, so here's the verbose version.

    We are processing credit card charges from a web server. We run an "authorization only" charge first to make sure that the card is valid. Our customers are not charged until the service is delivered, so a short time later we need to "delete authorization" to release the funds that the credit card company holds against the card's credit after an authorization-only charge. I have a web page that queries the database for authorizations that have not been deleted yet, and uses curl to communicate with a credit card processing gateway. Curl is the only available interface to the credit card server.

    The curl data string specifies a web page to open on success and another for failure. The curl sequence looks like this:

    Code:
    $fields = array(
        'ssl_merchant_id'=>$sys->merchant_login_id,
        'ssl_user_id'=>$sys->merchant_username,
        'ssl_pin'=>$sys->merchant_pin,
        'ssl_show_form'=>'false',
        'ssl_result_format'=>'HTML',
        'ssl_test_mode'=>'false',
        'ssl_card_present'=>'N',
        'ssl_receipt_apprvl_method'=>'REDG',
        'ssl_receipt_decl_method'=>'REDG',
        'ssl_partial_auth_indicator'=>'0',
        'ssl_error_url'=>urlencode($ssl_error_url),
        'ssl_receipt_apprvl_get_url'=>urlencode($url_to_handle_reply_from_credit_card_gateway),
        'ssl_receipt_decl_get_url'=>urlencode($url_to_handle_reply_from_credit_card_gateway),
        'ssl_transaction_type'=>'ccdelete',
        'ssl_invoice_number'=>$cc->ChargeID,
        'ssl_txn_id'=>$txn_id_to_be_deleted
    );

    //initialize the post string variable
    $fields_string = '';

    //build the post string
    foreach($fields as $key=>$value)
    {
        $fields_string .= $key . '=' . $value . '&';
    }
    //rtrim() returns the trimmed string, so the result must be assigned back
    $fields_string = rtrim($fields_string, '&');

    //open curl session
    $ch = curl_init();

    //begin setting curl options, set URL
    curl_setopt($ch, CURLOPT_URL, $url);

    //set method
    curl_setopt($ch, CURLOPT_POST, 1);

    //set post data string
    curl_setopt($ch, CURLOPT_POSTFIELDS, $fields_string);

    //return the response as a string so it can be stored in $result
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

    //these two options are frequently used to avoid SSL certificate errors with PHP,
    //but note that they disable certificate verification
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
    curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, false);

    //perform the curl post and store the result
    $result = curl_exec($ch);

    //close the curl session
    curl_close($ch);
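    As a side note, if the urlencode() calls were removed from the $fields array, PHP's built-in http_build_query() could build the same POST string and handle the URL encoding in one step. A minimal sketch:

    Code:
    // sketch only: assumes the urlencode() calls are dropped from $fields,
    // because http_build_query() already URL-encodes every value
    $fields_string = http_build_query($fields);
    curl_setopt($ch, CURLOPT_POSTFIELDS, $fields_string);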
    When I run it manually from a browser, it works perfectly. However, I need to automate it and run it periodically. I attempted to use wget in the crontab to launch the page, but the reply that a browser would normally receive gets written to a text file in the root directory instead. I look inside the text file and see a header that would have caused my "url_to_handle_reply_from_credit_card_gateway" web page to be called.

    Is there a way of accomplishing what I need?
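    For reference, wget's default behaviour is to save the fetched reply to a file named after the URL, which matches the text file showing up in the root directory; the -O option can redirect or discard that output. A minimal sketch of the kind of cron-friendly call involved, with the URL as a placeholder:

    Code:
    # print the fetched page to stdout instead of saving it to a file (URL is a placeholder)
    wget -q -O - 'https://www.example.com/delete_authorizations.php'

    # or discard the reply body entirely
    wget -q -O /dev/null 'https://www.example.com/delete_authorizations.php'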

  4. #4
    Linux User | Join Date: Dec 2009 | Posts: 264
    Hi,

    First, a bit of background:
    curl is a tool for accessing an HTTP, or in your case an HTTPS, server.

    So the most effective way to solve your problem would be to write a bash script that makes the request on the server itself ...

    If you want to use the PHP script, you can run it from the command line instead of using a tool to access the local HTTP server.

    Example:
    Code:
    user@server:~$ echo "<?php echo \"test\n\";?>" > test.php
    user@server:~$ php test.php
    test
    user@server:~$
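    A crontab entry along these lines would then run the script on a schedule (the path and interval here are only placeholders):

    Code:
    # run the cleanup script every 15 minutes; path and schedule are placeholders
    */15 * * * * /usr/bin/php /var/www/html/delete_authorizations.php >> /var/log/cc_cleanup.log 2>&1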
    I don't know what information you want to show or store after the request ...
    As I see it, you will need to add an entry to a database, remove a file, write a log entry, or something like that ... I can't think of a case where you would really want to just echo the output on the command line.

    Hope that info helps you,

    best regards, Zomby

  5. #5
    Just Joined! | Join Date: Jun 2006 | Posts: 12
    I appreciate your answer, but you are missing the point that the credit card server calls a URL with the response. The URL receiving the response needs to parse the arguments appended to the call. The credit card server will call the response URL with something like: response_url?first_param=1st_answer&second_param=2nd_answer, and so on. As I see it, only a browser can be used. I was hoping that wget would act like a browser, but I'm obviously way off base.
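    For context, the page at that response URL would read those appended arguments the usual PHP way. A minimal sketch, reusing the parameter names from the example above:

    Code:
    <?php
    // hypothetical response-handler page; parameter names follow the example above
    $first  = isset($_GET['first_param'])  ? $_GET['first_param']  : null;
    $second = isset($_GET['second_param']) ? $_GET['second_param'] : null;

    // e.g. mark the corresponding authorization as deleted in the database
    ?>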

  6. #6
    Linux User | Join Date: Dec 2009 | Posts: 264
    Hi, how are you doing?

    Have you figured something out yet?

    You probably mean that wget or curl won't execute any JavaScript ...

    Maybe lynx will execute it ... but it is still the wrong approach ...

  7. #7
    Just Joined! | Join Date: Jun 2006 | Posts: 12
    Quote Originally Posted by zombykillah:
    Hi, how are you doing?
    Maybe lynx will execute it ... it still is the wrong way ...
    I have concluded that it is not possible to do what I want. Instead, we are periodically launching the web page manually.
