Convert DateTime from UTC to another Timezone in Java – Simple way

I am not an advanced Java programmer, but I pulled pieces together from here and there and built a small utility that helped me work easily on one of my practice projects. Don't be frustrated converting a DateTime from UTC to another timezone in Java. I wrote it with simplicity.

//DateTime.java
package com.shaharia.javautil;

import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class DateTime {
    public static String changeDateTime(String dt, String timezone) throws ParseException {
        // Parse the input string as a UTC timestamp
        SimpleDateFormat sdfOriginal = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        sdfOriginal.setTimeZone(TimeZone.getTimeZone("UTC"));
        Date date = sdfOriginal.parse(dt);

        // Format the same instant in the target timezone
        SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        sdf.setTimeZone(TimeZone.getTimeZone(timezone));
        return sdf.format(date);
    }
}

I just created that class in Java. Now let's see how to use it.

package com.shaharia.javautil;

import java.text.ParseException;

public class TestDateTime {
    public static void main(String[] args) {
        
String input = "2018-04-18 18:00:50";   //yyyy-MM-dd HH:mm:ss
        
        try {
            String t = DateTime.changeDateTime(input, "Asia/Dhaka");
            System.out.println(t);  //It outputs 2018-04-19 00:00:50
        } catch (ParseException e) {
            e.printStackTrace();
        }
    }
}
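If you are on Java 8 or newer, the same conversion can also be done with the java.time API, which avoids SimpleDateFormat's thread-safety pitfalls. Here is a minimal sketch of that approach (the class name DateTimeModern is just my placeholder):

import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.ZoneOffset;
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;

public class DateTimeModern {
    public static String changeDateTime(String dt, String timezone) {
        DateTimeFormatter fmt = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");
        // Parse the input as a wall-clock time, then pin it to UTC
        LocalDateTime parsed = LocalDateTime.parse(dt, fmt);
        // Convert the same instant into the target timezone
        ZonedDateTime converted = parsed.atZone(ZoneOffset.UTC)
                .withZoneSameInstant(ZoneId.of(timezone));
        return converted.format(fmt);
    }

    public static void main(String[] args) {
        // Prints 2018-04-19 00:00:50, same as the class above
        System.out.println(changeDateTime("2018-04-18 18:00:50", "Asia/Dhaka"));
    }
}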

Write a comment if it works for you, and if you face any issue, please let me know.

Mass URL change in WordPress from MySQL Database

I just spent two hours figuring this out for a client's WordPress website. One of our clients' sites somehow started redirecting after its URLs were changed (he doesn't even know where they were changed from). So it was my job to fix that. And of course I fixed it, in a fairly engineered way.

I first noted down which tables and which columns contained the old URL. Then I built some MySQL queries and executed them from phpMyAdmin.

I wrote the following queries to replace every old URL with the new URL. You can do this not only for URLs but for any kind of text. But DON'T forget to take a backup of your database first.

UPDATE wp_options SET option_value = replace(option_value, 'http://my-old-url.com', 'http://new-url.com') WHERE option_name = 'home' OR option_name = 'siteurl';

UPDATE wp_posts SET post_content = replace(post_content, 'http://my-old-url.com', 'http://new-url.com');

UPDATE wp_postmeta SET meta_value = replace(meta_value,'http://my-old-url.com','http://new-url.com');

UPDATE wp_usermeta SET meta_value = replace(meta_value, 'http://my-old-url.com','http://new-url.com');

UPDATE wp_links SET link_url = replace(link_url, 'http://my-old-url.com','http://new-url.com');

UPDATE wp_comments SET comment_content = replace(comment_content , 'http://my-old-url.com','http://new-url.com');

UPDATE wp_posts SET guid = REPLACE(guid, 'http://my-old-url.com', 'http://new-url.com');

I only needed the queries for a couple of these tables myself, but I created the extra ones for you, so based on TABLE and COLUMN you can replace your OLD VALUE with your NEW VALUE yourself.

Here is the query format for anyone who is not familiar with MySQL:

UPDATE TABLE SET COLUMN_NAME = REPLACE(COLUMN_NAME, 'YOUR_OLD_VALUE', 'YOUR_NEW_VALUE')

With these simple tips, you can save a lot of time in this specific use case.

BigQuery Tips & Tricks [part – 1]

I am becoming a fan of BigQuery these days. It's saving me a lot of time, since all of our business platforms run on Google Cloud Platform. For every application and every operation, we keep logs from every corner of our infrastructure as much as possible, which gives us great insight into everything. All logs are stored in BigQuery. It's awesome: completely automated and scalable.

With Google's Data Studio, our project managers, business analysts and every other department (with or without technical knowledge) can build their own reports and get more insight into every component that makes our business great.

Google Data Studio & BigQuery

Sometimes I need a few tips and tricks to get more complex insights, transform data and handle other small tasks that make the whole ecosystem more useful. So I am keeping this article here for my own reference: whenever I forget any little thing, I can find it here. These BigQuery tips and tricks may be helpful for you too.

Convert timezone in query

All BigQuery timestamps are in the UTC timezone. If you want to convert a timestamp field into another timezone, just follow this:

SELECT DATETIME(timestamp, "Asia/Dhaka") FROM table
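If you run your queries from Java, a minimal sketch with the google-cloud-bigquery client library could look like this (the dataset, table and column names are placeholders for your own schema):

import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.FieldValueList;
import com.google.cloud.bigquery.QueryJobConfiguration;
import com.google.cloud.bigquery.TableResult;

public class TimezoneQuery {
    public static void main(String[] args) throws InterruptedException {
        // Uses the application default credentials of the current GCP project
        BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

        // mydataset.mytable is a placeholder; point it at your own log table
        QueryJobConfiguration config = QueryJobConfiguration.newBuilder(
                "SELECT DATETIME(timestamp, \"Asia/Dhaka\") AS local_time "
                        + "FROM `mydataset.mytable` LIMIT 10")
                .setUseLegacySql(false)
                .build();

        // Runs the query synchronously and iterates over the rows
        TableResult result = bigquery.query(config);
        for (FieldValueList row : result.iterateAll()) {
            System.out.println(row.get("local_time").getStringValue());
        }
    }
}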

Parse and Access JSON string with JSON Path

Assume you have some logs or other data in JSON format and you want to access properties inside the JSON string. You should use the JSON_EXTRACT() or JSON_EXTRACT_SCALAR() function.

SELECT JSON_EXTRACT('{"job_slugs":"I am a test Job slug"}', '$.job_slugs') AS job_slugs FROM table

It works based on JSONPath. You can evaluate your JSONPath expression at http://jsonpath.com. It's awesome. Note that JSON_EXTRACT() returns the value as a JSON string (with the double quotes), while JSON_EXTRACT_SCALAR() returns the plain value without them.

Remove double quotes from beginning and end of the string

Let's say you extracted some data and it looks like "String inside double quotes", quotes included. Now it's giving you trouble to work with that string. Simply follow this:

SELECT REGEXP_REPLACE('"String inside double quotes"', r'^"|"$', '') AS unquoted_string FROM table

And you will get String inside double quotes without the surrounding double quotes. It's a simple trick with the REGEXP_REPLACE() BigQuery function.

Also read my other BigQuery related posts from here.

Check whether a URL matches a Wildcard pattern

Suppose you have some rules for checking whitelisted URLs. Every time you get a new URL, you cross-match it against your whitelisted URLs. And whitelisted URLs can sometimes be wildcard URLs. So here is how to check whether a given URL matches any of the wildcard URLs.

<?php
$whiteListedUrls = [
    "https://google.com/test/*",
    "https://yahoo.com/test/index.html",
    "https://sohelrana.me/*",
    "https://sohelrana.me"
];

$url = "https://google.com/test/testWildCard/ok.html";


function checkUrlWildcard($url, $whiteListUrls = [])
{
    foreach ($whiteListUrls as $wUrl) {
        // Escape regex special characters in the whitelist entry,
        // then turn the escaped wildcard (\*) back into a real ".*"
        $pattern = preg_quote($wUrl, '/');
        $pattern = str_replace('\*', '.*', $pattern);
        // Match the whole URL against the rule, case-insensitively
        $matched = preg_match('/^' . $pattern . '$/i', $url);
        if ($matched > 0) {
            return true;
        }
    }

    return false;
}

//it will return true
var_export(checkUrlWildcard($url, $whiteListedUrls));

The URL https://google.com/test/* actually means every URL path after https://google.com/test/. So if we check whether https://google.com/test/testWildCard/ok.html is covered by our rules, it will of course return true.

It's a simple PHP script, but it may help somebody.
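If you would rather do the same check in Java, here is a minimal sketch of the same idea (my own port, not part of the original script): quote the literal pieces of each rule and replace every * with .* before matching.

import java.util.List;
import java.util.regex.Pattern;

public class WildcardUrlMatcher {

    // Turn a wildcard rule like "https://google.com/test/*" into a regex:
    // quote the literal parts and join them with ".*"
    static boolean matchesAny(String url, List<String> whitelist) {
        for (String rule : whitelist) {
            String[] parts = rule.split("\\*", -1);
            StringBuilder regex = new StringBuilder();
            for (int i = 0; i < parts.length; i++) {
                if (i > 0) {
                    regex.append(".*");
                }
                regex.append(Pattern.quote(parts[i]));
            }
            if (Pattern.compile("^" + regex + "$", Pattern.CASE_INSENSITIVE)
                    .matcher(url).matches()) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        List<String> whitelist = List.of(
                "https://google.com/test/*",
                "https://sohelrana.me/*"
        );
        // Prints true, same as the PHP version
        System.out.println(matchesAny("https://google.com/test/testWildCard/ok.html", whitelist));
    }
}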

Copy BigQuery dataset to another dataset – copy all tables at once

Sometimes it's necessary to copy or clone an entire BigQuery dataset to another dataset. If your BigQuery dataset has lots of tables, it's very time consuming to clone or copy all the tables one by one.

But with the help of a shell script, we can combine a few `bq` commands and make our lives easier for this type of bulk task. I just googled for similar solutions and combined a few things into a single shell script.

Here is the script to copy or clone an entire BigQuery dataset into another dataset.

Shell script to copy a BigQuery dataset

#!/bin/sh
# Source and destination in bq "PROJECT:DATASET" form
export SOURCE_DATASET="BQPROJECTID:BQSOURCEDATASET"
export DEST_PREFIX="TARGETBQPROJECTID:TARGETBQDATASET._YOUR_PREFIX"

# List up to TOTAL_NUMBER_OF_TABLES tables in the source dataset,
# then submit an async copy job for each one
for f in `bq ls -n TOTAL_NUMBER_OF_TABLES $SOURCE_DATASET | grep TABLE | awk '{print $1}'`
do
  export CLONE_CMD="bq --nosync cp $SOURCE_DATASET.$f $DEST_PREFIX$f"
  echo $CLONE_CMD
  $CLONE_CMD
done

In the above script, `--nosync` submits each bq copy job asynchronously, so you don't need to wait for the current job to finish. Also, replace TOTAL_NUMBER_OF_TABLES with the number of tables in the existing dataset that you want to copy, e.g. 200.

Now run the shell script and kick off the bq copy jobs.