Posts

install PostgreSQL in Ubuntu 16.04

When following the "Install PostgreSQL in Ubuntu" guide on Ubuntu 16.04, I got the error below:

$ apt-get install postgresql-9.6
Reading package lists... Done
Building dependency tree
Reading state information... Done
You might want to run 'apt-get -f install' to correct these:
The following packages have unmet dependencies:
 adobereader-enu:i386 : Depends: libgtk2.0-0:i386 (>= 2.4) but it is not going to be installed
 postgresql-9.6 : Depends: postgresql-client-9.6
                  Depends: postgresql-common (>= 171~) but it is not going to be installed
                  Recommends: postgresql-contrib-9.6 but it is not going to be installed
E: Unmet dependencies. Try 'apt-get -f install' with no packages (or specify a solution).

The solution suggested in the last line and a normal Ubuntu upgrade do not solve the issue, but the command below fixes it:

$ sudo apt full-upgrade

After the fix, apt-get can install postgresql again. After installing postgresql,

install ipython in Cloudera VM using pip

I am practicing pyspark in the Cloudera VM, and pyspark needs to be launched with ipython. All the old Cloudera VM guides use easy_install to install ipython, but that no longer works in VirtualBox 5.2.0 r118431 and later; it gives this error:

$ sudo easy_install ipython==1.2.1
...
error: Could not find suitable distribution for Requirement.parse('ipython==1.2.1')

The solution is to use pip to install ipython. But by default, pip is not installed in the Cloudera VM, and pip cannot be installed by easy_install either. After some searching, I noticed the Cloudera VM has yum installed, and yum can install pip:

$ sudo yum install python-pip
...
$ which pip
/usr/bin/pip

Now ipython can be installed by pip:

$ sudo pip install ipython==1.2.1
...
Successfully installed ipython-1.2.1

After that, launch pyspark with ipython:

$ PYSPARK_DRIVER_PYTHON=ipython pyspark
...
Welcome to
(the Spark ASCII-art welcome banner follows)
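As a quick sanity check once the shell is up, something like the minimal sketch below can be run (my own illustration, not from the post; it assumes the pyspark shell has created the usual SparkContext as sc):

# Run inside the ipython-driven pyspark shell, where `sc` is the
# SparkContext created by the shell (assumption: default shell setup).
rdd = sc.parallelize(range(10))         # small test RDD
print(rdd.map(lambda x: x * x).sum())   # sum of squares 0..9, prints 285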

VBScript tips - Excel macro to fill formulas

Thanks to the post here, I can now use syntax highlighting in Google blog posts. As a programmer, it took me several hours to figure out some VB usage; some behaviors and concepts differ from a normal programming language, so I summarize some tips below. The example snippet I implemented is as follows; this macro generates Excel formulas and fills them into the Excel file.

Sub FillReportSummary()
'
' FillReportSummary Macro
'
'
    Dim description As Variant
    description = Array("Type A", "Type B", "Type C", "Type D")

    Dim headerNum As Integer
    headerNum = UBound(description) + 1
    'MsgBox ("headerNum: " & headerNum & description(0))

    Dim headerFormulas() As Variant
    ReDim headerFormulas(1 To headerNum)
    Dim strFormulas() As Variant
    ReDim strFormulas(1 To headerNum)
    Dim calSumFormula As String

    'With ThisWorkbook.Sheets("SheetName")
    With ActiveWo
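Outside VBA, the same "build formula strings and fill them into cells" idea can be sketched in Python with openpyxl (my own hypothetical illustration, not the post's macro; it assumes openpyxl is installed, and the file name and ranges are made up):

# Hypothetical Python/openpyxl version of the idea: write the headers
# and a SUM formula string for each column.
from openpyxl import Workbook
from openpyxl.utils import get_column_letter

wb = Workbook()
ws = wb.active

description = ["Type A", "Type B", "Type C", "Type D"]
for col, name in enumerate(description, start=1):
    ws.cell(row=1, column=col, value=name)          # header row
    letter = get_column_letter(col)
    # formula summing rows 2..100 of this column (made-up range)
    ws.cell(row=102, column=col, value="=SUM(%s2:%s100)" % (letter, letter))

wb.save("report_summary.xlsx")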

docker usage tips

Docker is really a smart way to run projects with different environment settings. Below are some basic but must-know commands when using Docker. docker-compose.yml is the name of the .yml file used as the settings file for an app running in Docker (a minimal example of such a file is sketched after the list below).

1. Start an app:
sudo docker-compose up
Notice that when used like this, there is only one app defined in the docker-compose file. The log console window then keeps running, showing the real-time log info.
sudo docker-compose up -d
The -d option runs the Docker container as a background app, and the server log info will not be displayed.

2. Stop an app:
sudo docker-compose stop
If the code is modified, you need to "stop" and "up" the app again.

3. Show the running containers:
sudo docker ps
This shows the "CONTAINER ID", "IMAGE", "NAMES" and other basic info for all the running containers.

4. Show the server log file:
sudo docker logs CONTAINER_ID
Or use the docke
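For reference, here is a minimal sketch of what such a docker-compose file can look like (the service name, image and ports are made-up placeholders, not taken from the post):

# docker-compose.yml (hypothetical minimal example)
version: "3"
services:
  web:                    # made-up service name
    image: nginx:latest   # made-up image
    ports:
      - "8080:80"         # host:container port mapping

With a file like this in the current directory, "sudo docker-compose up" starts the web service it describes.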

timestamp with 16, 13 and 10 digits to Qlik date

Sometimes we need to convert Linux timestamps to a Qlik date. It can be done in the SQL script, and it can also be done in the Qlik load script or in an expression. Linux timestamps with 16 digits are microsecond timestamps, 13 digits are millisecond timestamps, and 10 digits are second timestamps. To convert to a Qlik date, hour or time, proceed as follows. Since the timestamps count from "19700101 00:00:00" (the Unix epoch), this start date needs to be added (a quick cross-check of the arithmetic is sketched after the formulas):

For 16 digits:
Date(Date#('19700101', 'YYYYMMDD') + (SynchronousQuery_Event_Time/1000000/86400))
or
Date(25569 + Round(SynchronousQuery_Event_Time/1000000/86400), 'YYYYMMDD')
25569 is the numeric value of the date '19700101'.

For 13 digits:
Date(Date#('19700101', 'YYYYMMDD') + (SynchronousQuery_Event_Time/1000/86400))
or
Date(25569 + Round(SynchronousQuery_Event_Time/1000/86400), 'YYYYMMDD')

For 10 digits:
Date(Date#('19700101', 'YYYYMMDD') + (SynchronousQuery_Event_T
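As a cross-check of the same arithmetic outside Qlik, here is a minimal Python sketch (my own illustration, not part of the post; the sample timestamp values are made up):

# Convert a 10/13/16-digit Linux timestamp to a UTC datetime, mirroring
# the divide-by-10^n logic in the Qlik formulas above.
from datetime import datetime, timedelta, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)   # '19700101 00:00:00'

def ts_to_datetime(ts):
    divisor = {10: 1, 13: 1000, 16: 1000000}[len(str(ts))]  # s / ms / us
    return EPOCH + timedelta(seconds=ts / divisor)

print(ts_to_datetime(1470096000))          # 10 digits -> 2016-08-02 00:00:00 UTC
print(ts_to_datetime(1470096000000))       # 13 digits -> same instant
print(ts_to_datetime(1470096000000000))    # 16 digits -> same instant

The Qlik formulas additionally divide by 86400 because Qlik dates count days; here timedelta(seconds=...) takes care of that conversion.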

Downgrade Firefox using a .deb file

System: Ubuntu 14.04.5

Problem: I use Firefox version 47 for a screenshot solution using selenium in Python. After a system update, Firefox updated to the latest version 48 (2016, Aug, 02) and raises an error:

WebDriverException: Message: Requested size exceeds screen size

There is a discussion about this bug here.

Solution: Before the new bug-fix version is available, I decided to first downgrade to version 47.

Steps:
1. sudo apt-get purge firefox
   sudo apt-get autoremove
The first command removes Firefox, and the second removes all the dependencies that are no longer used; Ubuntu recommended this after running the first command. The autoremoved packages are listed below, in case any dependency is missed by another command:

The following packages were automatically installed and are no longer required:
  acl at-spi2-core colord dconf-gsettings-backend dconf-service fontconfig
  fontconfig-config fonts-dejavu-core hicolo
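For context, the screenshot step in such a selenium setup looks roughly like the sketch below (my own minimal assumption of the script, not the post's code; the URL and output file name are placeholders):

# Minimal selenium screenshot sketch (hypothetical). The window-resize
# call is the kind of request behind the "Requested size exceeds screen
# size" error seen with Firefox 48.
from selenium import webdriver

driver = webdriver.Firefox()
driver.set_window_size(1920, 1080)        # request a fixed window size
driver.get("https://example.com")         # placeholder URL
driver.save_screenshot("screenshot.png")  # placeholder output file
driver.quit()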

run batch in Task Scheduler raises problem in Server 2008 R2

For a long time, I sometimes got a (0x1) error from Task Scheduler in Windows Server 2008 R2. I finally solved it by separating all the commands in the batch file and adding them one by one as "Actions" in Task Scheduler. It seems that Task Scheduler cannot run a batch file correctly for some reason, but I have no idea why.