Mass URL change in WordPress from MySQL Database

I just spent two hours figuring this out for a client’s WordPress website. The site somehow started redirecting after its URLs were changed (the client doesn’t even know where the change came from). So it was my job to fix it, and of course I fixed it, but in a fairly engineered way.

First I took note of which tables and columns contained the old URL. Then I built a few MySQL queries and executed them from phpMyAdmin.

I wrote the following queries to replace every old URL with the new one. You can use the same approach for any kind of text, not only URLs. But DON’T forget to take a backup of your database first. In the queries below, 'OLD_URL' and 'NEW_URL' are placeholders; replace them with your actual old and new site URLs.

UPDATE wp_options SET option_value = REPLACE(option_value, 'OLD_URL', 'NEW_URL') WHERE option_name = 'home' OR option_name = 'siteurl';

UPDATE wp_posts SET post_content = REPLACE(post_content, 'OLD_URL', 'NEW_URL');

UPDATE wp_postmeta SET meta_value = REPLACE(meta_value, 'OLD_URL', 'NEW_URL');

UPDATE wp_usermeta SET meta_value = REPLACE(meta_value, 'OLD_URL', 'NEW_URL');

UPDATE wp_links SET link_url = REPLACE(link_url, 'OLD_URL', 'NEW_URL');

UPDATE wp_comments SET comment_content = REPLACE(comment_content, 'OLD_URL', 'NEW_URL');

UPDATE wp_posts SET guid = REPLACE(guid, 'OLD_URL', 'NEW_URL');

The queries above cover the standard WordPress tables, but the same pattern works for any TABLE and COLUMN, so you can replace your OLD VALUE with a NEW VALUE anywhere yourself.

For anyone not familiar with MySQL, the general query format is:

UPDATE TABLE_NAME SET COLUMN_NAME = REPLACE(COLUMN_NAME, 'OLD VALUE', 'NEW VALUE');


With these simple tips, you can save a lot of time in this specific scenario.
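If you have many tables to cover, you can also generate the statements instead of typing them by hand. Here is a minimal Python sketch of that idea; the table list and the URLs below are placeholder examples, not values from this post:

```python
# Minimal sketch: generate the search-and-replace UPDATE statements
# for a list of (table, column) pairs. The table names follow the
# standard WordPress schema; the URLs are placeholders you must change.
def build_replace_query(table, column, old, new):
    # Note: string interpolation is fine for a one-off admin script,
    # but never build queries like this from untrusted input.
    return (
        f"UPDATE {table} SET {column} = "
        f"REPLACE({column}, '{old}', '{new}');"
    )

targets = [
    ("wp_posts", "post_content"),
    ("wp_postmeta", "meta_value"),
    ("wp_comments", "comment_content"),
]

for table, column in targets:
    print(build_replace_query(
        table, column,
        "http://old-url.example", "https://new-url.example"))
```

Paste the printed statements into phpMyAdmin, again only after taking a backup.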

BigQuery Tips & Tricks [part – 1]


I have become a fan of BigQuery recently. It saves me a lot of time, as all of our business platforms run on Google Cloud Platform. For every application and every operation, we keep logs from every corner of our infrastructure as much as possible, which gives us great insight into everything. All logs are stored in BigQuery. It’s awesome: completely automated and scalable.

With Google’s Data Studio, our project managers, business analysts, and every other department (with or without technical knowledge) can build their own reports and get more insight into every component that makes our business great.

Google Data Studio & BigQuery

Sometimes I need a few tips and tricks to get more complex insights, do data transformations, and handle other small tasks that make the whole ecosystem more useful. I am keeping these notes here for my own reference, so whenever I forget a small detail I can find it again. Hopefully these BigQuery tips and tricks will be helpful for you too.

Convert timezone in query

All BigQuery timestamps are stored in UTC. If you want to convert a timestamp field to another timezone, just do this:

SELECT DATETIME(timestamp, "Asia/Dhaka") FROM table
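If you ever want to sanity-check the conversion outside BigQuery, the same shift can be reproduced in Python with the standard zoneinfo module (the sample timestamp below is made up):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# A made-up UTC timestamp, like the ones BigQuery stores.
utc_ts = datetime(2020, 1, 1, 12, 0, 0, tzinfo=timezone.utc)

# Same idea as DATETIME(timestamp, "Asia/Dhaka") in BigQuery.
dhaka = utc_ts.astimezone(ZoneInfo("Asia/Dhaka"))
print(dhaka.isoformat())  # Asia/Dhaka is UTC+6, so the hour becomes 18
```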

Parse and Access JSON string with JSON Path

Assume you have some logs or data in JSON format and you want to access properties inside the JSON string. You should use the JSON_EXTRACT() or JSON_EXTRACT_SCALAR() function.

SELECT JSON_EXTRACT('{"job_slugs":"I am a test Job slug"}', '$.job_slugs') AS job_slugs FROM table

It works based on JSON path expressions, and you can evaluate a JSON path expression separately before using it in a query. It’s awesome.
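Outside of BigQuery, you can check what the JSON path should return with a couple of lines of Python (same sample payload as above):

```python
import json

payload = '{"job_slugs": "I am a test Job slug"}'

# Equivalent of JSON_EXTRACT_SCALAR(payload, '$.job_slugs'):
# parse the string, then walk the path.
job_slugs = json.loads(payload)["job_slugs"]
print(job_slugs)
```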

Remove double quotes from beginning and end of the string

Let’s say you extracted some data and it looks like "String inside double quotes". Those surrounding quotes now make the string troublesome to work with. Simply do this:

SELECT REGEXP_REPLACE('"String inside double quotes"', r'^"|"$', '') AS unquoted_string FROM table

And you will get String inside double quotes without the surrounding double quotes. It’s a simple trick with BigQuery’s REGEXP_REPLACE() function.
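The same regular expression works anywhere; here is a quick Python check of the pattern used above:

```python
import re

quoted = '"String inside double quotes"'

# ^"|"$ matches a double quote at the start OR at the end of the
# string, exactly like the BigQuery REGEXP_REPLACE() call above.
unquoted = re.sub(r'^"|"$', '', quoted)
print(unquoted)
```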

Also, read my other BigQuery-related posts from here.

Check whether a URL matches a wildcard pattern

Suppose you have some rules for whitelisted URLs. Every time you receive a new URL, you need to cross-match it against your whitelist, and whitelisted URLs can sometimes contain wildcards. Here is how to check whether a given URL matches any of the wildcard URLs.

$whiteListedUrls = [
    'https://example.com/*', // example rules; replace with your own whitelist
    'https://example.org/blog/*',
];

$url = "https://example.com/any/path"; // example URL to test

function checkUrlWildcard($url, $whiteListUrls = [])
{
    foreach ($whiteListUrls as $wUrl) {
        $pattern = preg_quote($wUrl, '/');
        $pattern = str_replace('\*', '.*', $pattern);
        $matched = preg_match('/^' . $pattern . '$/i', $url);
        if ($matched > 0) {
            return true;
        }
    }
    return false;
}

//it will return true
var_export(checkUrlWildcard($url, $whiteListedUrls));

A rule like https://example.com/* effectively covers every URL path under that host. Since our test URL falls under it, the check returns true, of course. (The rules and URL above are placeholder examples; use your own whitelist.)

It’s a simple PHP script, but it may help somebody.
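If PHP is not your stack, the same escape-then-widen trick carries over directly. Here is a rough Python equivalent (the rule and URL below are example values, not from the original post):

```python
import re

def check_url_wildcard(url, whitelist):
    # Same approach as the PHP version: escape the rule so regex
    # metacharacters are literal, then turn the escaped wildcard
    # back into ".*" and match the whole URL case-insensitively.
    for rule in whitelist:
        pattern = re.escape(rule).replace(r'\*', '.*')
        if re.fullmatch(pattern, url, re.IGNORECASE):
            return True
    return False

# Example rule and URL (placeholders).
print(check_url_wildcard("https://example.com/blog/post-1",
                         ["https://example.com/*"]))  # -> True
```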

Copy BigQuery dataset to another dataset – copy all tables at once

Sometimes it’s necessary to copy or clone an entire BigQuery dataset to another dataset. If your BigQuery dataset has lots of tables, it’s very time-consuming to clone or copy all the tables one by one.

But with the help of a shell script, we can combine a few `bq` commands and make our life easier for these types of bulk tasks. I just googled for similar solutions and combined a few things into a single shell script.

Here is the script to copy or clone an entire BigQuery dataset into another dataset.

Shell script to copy a BigQuery dataset

SOURCE_DATASET="your_source_dataset"   # placeholder: your source dataset
DEST_PREFIX="your_dest_dataset."       # placeholder: destination dataset; note the trailing dot

for f in `bq ls -n TOTAL_NUMBER_OF_TABLES $SOURCE_DATASET | grep TABLE | awk '{print $1}'`
do
  export CLONE_CMD="bq --nosync cp $SOURCE_DATASET.$f $DEST_PREFIX$f"
  echo $CLONE_CMD
  echo `$CLONE_CMD`
done

In the above command, --nosync submits each bq copy job asynchronously, so you don’t need to wait for the current job to finish. Also, replace TOTAL_NUMBER_OF_TABLES with the number of tables in the existing dataset that you want to copy, e.g. 200.
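If the shell syntax is hard to follow, this is all the loop really does, sketched in Python with made-up dataset and table names (the real script gets the table list from `bq ls`):

```python
# Sketch of what the shell loop builds: one "bq cp" command per table.
# Dataset names and the table list are placeholders.
def build_copy_commands(tables, source="source_dataset",
                        dest_prefix="dest_dataset."):
    return [f"bq --nosync cp {source}.{t} {dest_prefix}{t}" for t in tables]

for cmd in build_copy_commands(["users", "events", "logs"]):
    print(cmd)
```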

Now run the above shell script, and it will start kicking off the bq copy jobs.

Supercharge WP REST API with Google App Engine

It seems I am pretty late to the party with the WordPress REST API. It’s really super cool! Now I can think about how well and how fast I can distribute or work with WordPress blog posts. You can even host your whole WordPress blog inside a static HTML page. Funny? Yeah: your HTML page interacts with the REST API’s JSON response, and a little JavaScript displays the content inside the page. Then you get extremely fast blog post delivery. Trust me, it works; you will understand if you are a bit technical.

I was thinking that if I could easily get all of my posts’ data as a REST API JSON response, then I could use it wherever and whenever I want. But there is a catch: the REST API still runs from inside your main WordPress hosting.

But what if we could recruit a middleman that holds the data and delivers it without even touching your original WordPress site? You would save lots of money, and your site’s performance would get a huge boost.

I just ran an experiment (I shared the snippet here) with Google App Engine. Why Google App Engine? It’s one of the best solutions for ready-set-go app deployment, with lots of features that Google maintains itself. So I am going to write a small PHP script that loads your site’s REST API JSON data once, and serves all subsequent requests directly from GAE’s Memcache. More simply: if your site has 1M visitors a day, you will probably hit your original WP site once, and the other 999,999 users will get the data directly from App Engine (the PHP app persists the data through Memcache).
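The caching logic described above is a classic cache-aside pattern. Stripped of the App Engine specifics, it boils down to something like this sketch (a plain dict stands in for Memcache, and the fetch function is a stub; all names are illustrative):

```python
# Cache-aside sketch: the first request hits the origin, later
# requests are answered from the cache.
cache = {}
origin_hits = 0

def fetch_from_origin(url):
    # Stub for the real HTTP call to the WordPress REST API.
    global origin_hits
    origin_hits += 1
    return '{"posts": []}'

def get_cached_json(url):
    if url in cache:
        return cache[url]          # served without touching WordPress
    data = fetch_from_origin(url)  # only the first request hits origin
    cache[url] = data
    return data

get_cached_json("https://example.com/wp-json/wp/v2/posts")
get_cached_json("https://example.com/wp-json/wp/v2/posts")
print(origin_hits)  # -> 1
```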

I tried it, and it works. I ran 50,000 hits, and it took only a few milliseconds to deliver my WordPress site’s data from Google App Engine. I also designed an HTML page with simple JS that loads the content from my App Engine app and displays it inside the page, and a static HTML page can serve a huge number of users per minute. Voila 🙂

So here is my little PHP script. I also uploaded the full GAE implementation to a GitHub repository, so you can download it directly, change your site’s URL, and deploy it to Google App Engine.

/**
 * Serve remote URL content through App Engine by using Memcache. Pretty interesting.
 * It's really cool!
 * @Author Shaharia Azam <>
 */
define("KB_JSON_URL", "https://YOUR-WEBSITE-ADDRESS/wp-json/wp/v2");

$memcached = new Memcached();
$memcached->addServers([
    ['localhost', 11211, 33]
]);

/**
 * @param $url
 * @param string $method
 * @return mixed
 */
function fetchJSONFromRemoteWP($url, $method = 'GET')
{
    $curl = curl_init();
    curl_setopt_array($curl, array(
        CURLOPT_URL => $url,
        CURLOPT_RETURNTRANSFER => true, // return the response instead of printing it
        CURLOPT_TIMEOUT => 30,
        CURLOPT_CUSTOMREQUEST => $method,
    ));

    $data = curl_exec($curl);
    curl_close($curl);

    return $data;
}

// e.g. the posts endpoint of the WP REST API
$remoteUrl = KB_JSON_URL . '/posts';
$cacheKey = base64_encode($remoteUrl);

$content = $memcached->get($cacheKey);
if (!empty($content)) {
    $data = $content;
} else {
    $data = fetchJSONFromRemoteWP($remoteUrl);
    $memcached->set($cacheKey, $data, 360000);
}

header('Content-Type: application/json');
echo json_encode(json_decode($data, true));


And here is the app.yaml configuration for Google App Engine

runtime: php55
api_version: 1

service: kb-wp-reverse-proxy

threadsafe: true

handlers:
- url: /assets
  static_dir: public/assets

- url: /.*
  script: public/index.php
  secure: always

skip_files:
- ^\.git$

These snippets are available on GitHub. If you face any trouble and want to optimize your site and make it super fast and stable with this type of solution, feel free to reach me anytime.