Copy a BigQuery dataset to another dataset – copy all tables at once

Sometimes it’s necessary to copy or clone an entire BigQuery dataset to another dataset. If your BigQuery dataset has lots of tables, it’s very time-consuming to clone or copy all of them one by one.

But with the help of a shell script, we can combine a few `bq` commands and make our lives easier for this type of bulk task. I googled for similar solutions and combined a few of them into a single shell script.

Here is the script to copy or clone an entire BigQuery dataset into another dataset.

Shell Script to Copy a BigQuery Dataset

#!/bin/sh
# Source: project and dataset containing the tables to copy
export SOURCE_DATASET="BQPROJECTID:BQSOURCEDATASET"
# Destination: target project, dataset, and an optional table-name prefix
export DEST_PREFIX="TARGETBQPROJECTID:TARGETBQDATASET._YOUR_PREFIX"
# List every table in the source dataset, then copy each one
for f in `bq ls -n TOTAL_NUMBER_OF_TABLES $SOURCE_DATASET | grep TABLE | awk '{print $1}'`
do
  export CLONE_CMD="bq --nosync cp $SOURCE_DATASET.$f $DEST_PREFIX$f"
  echo $CLONE_CMD    # show the copy command being submitted
  $CLONE_CMD         # submit the asynchronous copy job
done
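Save the script and make it executable before running it (the file name copy_bq_dataset.sh below is just a placeholder):

chmod +x copy_bq_dataset.sh
./copy_bq_dataset.sh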

In the script, `--nosync` submits each `bq` copy job asynchronously, so you don’t have to wait for the current job to finish before the next one starts. Also, replace TOTAL_NUMBER_OF_TABLES with the total number of tables in the source dataset that you want to copy, e.g. 200.

Now run the shell script and it will start kicking off the `bq` copy jobs.
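Because the copies run asynchronously, the script will finish before the jobs do. Here is a minimal sketch for checking on them with the standard `bq` job commands (the job ID below is a placeholder):

bq ls -j -n 20        # list your 20 most recent jobs and their states
bq wait JOB_ID        # block until the given job completes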

mongodump usage for MongoDB backup and restore

According to the MongoDB Backup manual, mongodump is a very handy tool for MongoDB administrators: together with mongorestore, it can back up and restore a full MongoDB deployment, a single database, or a single collection.

mongodump usage

Here are very simple commands to back up and restore a MongoDB system.

Backup


sudo mongodump --host hostName -d databaseName --port portNumber --username userName --password passWord --out directoryLocation

With the above command you can take a backup of your database. I don’t need to explain each parameter, as the placeholder names make their purpose clear.
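For example, with made-up values (database mydb on localhost:27017, user admin, output directory /backup/mongo), a backup run might look like this:

sudo mongodump --host localhost --port 27017 -d mydb --username admin --password secret --out /backup/mongo

Depending on where the user was created, you may also need to pass --authenticationDatabase (e.g. admin).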

Restore

sudo mongorestore --host hostName -d databaseName --port portNumber --username userName --password passWord backupSourceLocation
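And a matching restore with the same made-up values; note that mongodump writes the dump into a subdirectory named after the database, so that is what we point mongorestore at:

sudo mongorestore --host localhost --port 27017 -d mydb --username admin --password secret /backup/mongo/mydb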

I hope this helps you get things done quickly.