thk's note
Something about my own studies, programming notes, and also English writing exercises.

Sunday, December 18, 2016
I have an old post about how to change the document colors in Okular, but it gave no toggle option. Here is an updated version for those of you who might want to invert the colors of your PDF files in Okular (I myself always need this kind of toggle function).
By the way, my OS is Ubuntu 16.04 and my Okular version is 0.24.2. Just for your reference.
First, go to ``Settings --> Configure Shortcuts...''. In the settings window, you can search for the option you want. I used ``color'' to find the ``Toggle Change Colors'' option, as follows:
Choose ``Custom''. When you hover your mouse over the ``None'' button and wait a few seconds, a tooltip pops up with instructions for adding your own shortcut key binding:
I used ``Ctrl+I'' as the shortcut:
After setting it, you can press ``Ctrl+I'' to toggle between the normal and the inverted color schemes:
Friday, September 11, 2015
Setup notes on Emacs and Common Lisp
I recently thought about Lisp again, and realized that the last time I planned to learn it was six years ago... O__O
This time I will learn Lisp not with Vim, but with Emacs.
I am following Practical Common Lisp and found there's a so-called Lisp in a Box package, but I decided not to use it for now. My plan is to set up a working environment in Ubuntu using more common approaches.
Here is what I have done so far, left here as a note.
- Installed Emacs, SLIME, and CLISP: $ sudo apt-get install emacs slime clisp
- Created ~/.emacs.d/slime/ (to be checked...)
- Added ~/.emacs with the content shown after this list (copied from various websites; some parts are not yet clear to me)
- After adding MELPA, installed auto-complete via M-x package-install
- Also installed evil, which enables ``evil-mode'' for Vim-like keystrokes
- Used M-x slime to start SLIME and CLISP
;; MELPA
(when (>= emacs-major-version 24)
  (require 'package)
  (add-to-list
   'package-archives
   '("melpa" . "http://melpa.org/packages/")
   t)
  (package-initialize))

;; tell SLIME which Lisp implementation to run
(setq inferior-lisp-program "clisp")

;; Setup load-path, autoloads and your lisp system
(add-to-list 'load-path "~/.emacs.d/slime/")
(require 'slime-autoloads)
;; Also setup the slime-fancy contrib
;;(add-to-list 'slime-contribs 'slime-fancy)
;;(setq tab-always-indent 'complete)

;;; for auto-complete
(require 'auto-complete)
(global-auto-complete-mode t)
;(add-to-list 'ac-modes 'lisp-mode)
(add-to-list 'completion-styles 'initials t)
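One thing the listing above does not include is the evil setup; a minimal sketch for ~/.emacs, assuming the evil package was already installed via M-x package-install:

;; enable Vim-like modal editing everywhere
;; (assumes the evil package has been installed)
(require 'evil)
(evil-mode 1)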
Tuesday, June 16, 2015
Build PCL with OpenNI2 in Ubuntu 14.04
I built PCL from source and installed it on my Ubuntu 14.04 machine. Everything was fine until recently.
I wanted to rebuild one of my test programs, which used an Xtion as the input device and therefore needed OpenNI2. When running ``make'', the system complained that it couldn't find pcl/io/openni2_grabber.h. I checked and found that the header file was in the source package but not in the corresponding system folder, so something was wrong with the build configuration.
The solution is to turn on the flag BUILD_OPENNI2, which defaults to OFF[1]. I used
$ grep -r "BUILD_OPENNI2" .

to find that it was in CMakeCache.txt. Then I just changed its value from OFF to ON and ran make && sudo make install again.
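Alternatively, the flag can be passed directly when configuring, instead of editing the cache by hand. A minimal example, assuming it is run from the build directory: $ cmake -DBUILD_OPENNI2=ON ..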
Monday, January 05, 2015
My QuickBot test with IR sensors
By following the Coursera course Control of Mobile Robots[1] by Dr. Magnus Egerstedt, I've built my first QuickBot, with a BeagleBone Black (BBB) at its core.
The course closed several months ago, but I didn't get enough leisure time (and the proper mood... maybe...) to complete the robot until these past weeks. The following photos show the QuickBot:

Front view of my QuickBot
Top view; the BBB is on the left side
The mess of wires between the chassis

What I have done so far is to test the sensors and motors with the help of the test code qb_test written by Mike Kroutikov. Because I used Sharp GP2Y0A41SK0F IR sensors, which are different from the ones specified in the course, I was not sure whether their output would be identical to that described in the comments of the test code. Therefore, after some tests, I decided to modify the test code so that the IR sensor readings are shown as detected distances in centimetres[3].
I searched and found some conversion formulas for Arduino[2]:

(formula image; source: http://www.dfrobot.com/wiki/index.php?title=File:Formulas.jpg)
The trend of the output seemed right, but the absolute values were far from accurate. I might have to do some tests to find my own formula.
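If you want to experiment with such a formula, a power-law fit can be dropped into a small Arduino-style test sketch. The following is a minimal sketch, not my actual test code; the constants 12.08 and -1.058 are one commonly cited fit for this 4-30 cm sensor and should be treated as assumptions to calibrate against your own measurements:

// Minimal sketch: read a Sharp GP2Y0A41SK0F on analog pin A0 and
// convert the reading to centimetres with an assumed power-law fit.
const int irPin = A0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(irPin);            // 0..1023 from the 10-bit ADC
  float volts = raw * 5.0 / 1023.0;       // assuming a 5 V analog reference
  float cm = 12.08 * pow(volts, -1.058);  // assumed voltage-to-distance fit
  Serial.println(cm);                     // distance in cm
  delay(100);
}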
What failed in my tests was the communication between the host PC (running pysimiam) and the QuickBot's BBB (running quickbot_bbb). I have a Wi-Fi adapter connected to the BBB, and it worked perfectly for accessing the internet, but I just couldn't connect to the BBB wirelessly from my PC. I decided to postpone this part.
My next plans are:
- to write some code to drive the motors using the IR readings as reference;
- to change the battery packs to Li-Poly ones.
---
[1] Without signing up for the course, you may not be able to view the content. Here is the playlist on YouTube: https://www.youtube.com/playlist?list=PLp8ijpvp8iCvFDYdcXqqYU5Ibl_aOqwjr . It contains only the lecture videos, however.
[2] http://www.dfrobot.com/wiki/index.php/SHARP_GP2Y0A41SK0F_IR_ranger_sensor_%284-30cm%29_SKU:SEN0143
[3] My modified code is here: https://github.com/hiankun/qb_test.git
Friday, November 14, 2014
OpenCV ORB feature matching test
Code is available here: https://github.com/hiankun/orb_test
Here is my test of feature matching with ORB. In the video, the target image is shown on the left side and the real-time video in the upper-right corner.
The feature points on the target image matched the target correctly when no other textured objects were present. If another object had detectable feature points, however, the matching was disturbed significantly.
I have not yet tested the matching approach with SURF or SIFT features; that will be the next step.
Furthermore, I will try to apply findHomography() and getPerspectiveTransform() to locate the correct object.
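As a sketch of that next step (this is not code from the repository above; good_matches, kp_target, and kp_scene are hypothetical names for the ORB matching results), the usual recipe is to feed the matched keypoint coordinates to findHomography() with RANSAC, so that only geometrically consistent matches survive:

#include <vector>
#include "opencv2/opencv.hpp"

// Estimate where the target sits in the scene, given ORB matches
// and the keypoints they refer to.
cv::Mat locateTarget(const std::vector<cv::DMatch>& good_matches,
                     const std::vector<cv::KeyPoint>& kp_target,
                     const std::vector<cv::KeyPoint>& kp_scene)
{
    if (good_matches.size() < 4)  // a homography needs at least 4 point pairs
        return cv::Mat();

    std::vector<cv::Point2f> pts_target, pts_scene;
    for (size_t i = 0; i < good_matches.size(); ++i) {
        pts_target.push_back(kp_target[good_matches[i].queryIdx].pt);
        pts_scene.push_back(kp_scene[good_matches[i].trainIdx].pt);
    }
    // RANSAC rejects matches that don't fit a single planar transform
    return cv::findHomography(pts_target, pts_scene, cv::RANSAC, 3.0);
}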
Adding my project to GitHub
Finally, I pushed my own test project to GitHub!!
Okay, I know it's not a big deal, but it's my first step toward using git as more than just a local logging tool.
Actually, I was slightly scared by the complexity of git, especially its mystical jargon such as rebase, merge, cherry-pick and so on.
I thought, however, that git shouldn't be used only as a local logging tool. Also, I wanted to use GitHub as a code repository, which is convenient for people who want to check my code. By uploading my code to GitHub, I don't need to copy and paste it into blog posts. That's a better and smarter way to share source code.
Oh, before diving into GitHub, I found the following links very helpful:
1. try Git, an interactive tutorial to give you a basic understanding;
2. Adding an existing project to GitHub using the command line
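For the record, the second guide's procedure boils down to a handful of commands; a minimal sketch, where the repository URL and names are placeholders for your own:

$ git init
$ git add .
$ git commit -m "first commit"
$ git remote add origin https://github.com/your_name/your_repo.git
$ git push -u origin master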
Monday, November 10, 2014
Test OpenCV on BeagleBone Black using Logitech C920
Here are some basic trials in which I tested OpenCV running on my BBB.
The Logitech C920 webcam does some work on its own which I don't understand very well, so that the load on the BBB is reduced significantly. Other webcams might be too ``slow'' for the test program to run directly (I remember the terminal returning `select timeout' errors).
[NOTE: the following videos were boring and showed nothing exciting... :-p]
[Sorry for the small view of the videos. I will find how to enlarge them later...]
[I've uploaded the video clips so that you can view them with better resolution.]
The first video shows the result displayed via VNC. The image stream was laggy, but the measured processing time was about 100 to 200 ms.
The second shows the result without cv::imshow(), still via VNC. The processing time seemed reduced, but not significantly.
The last shows the result without cv::imshow(), accessed via ssh. The processing time was no more than 100 ms.
It seems that VNC consumed most of the computing resources.
The test code was:
#include <stdio.h>
#include "opencv2/opencv.hpp"

int main()
{
    cv::VideoCapture cap(0);  // open the default camera
    if (!cap.isOpened())
        return -1;

    int64 e1, e2;
    double time;
    for (;;) {
        e1 = cv::getTickCount();
        cv::Mat frame;
        cap >> frame;  // grab a frame
        cv::imshow("match result", frame);
        e2 = cv::getTickCount();
        // elapsed ticks / tick frequency = seconds; *1000 for milliseconds
        time = (e2 - e1) / cv::getTickFrequency() * 1000.;
        printf("time: %f ms\n", time);
        if (cv::waitKey(30) >= 0)
            break;
    }
    return 0;
}
BeagleBone Black Wifi connection and VNC settings (Debian)
As mentioned in my previous post (Setting up the Wifi connection of BeagleBone Black (Angstrom)), I couldn't get the Wi-Fi connection working with the Angstrom images, so I decided to try the Debian image again.
[A brief note for my first test with the Debian image]
Actually, I installed the Debian image several days ago and did set up the Wi-Fi connection successfully. When I tried to update the system, however, the terminal replied with an error message saying that the system didn't have enough space. So I went back to Angstrom. And now I'm back to Debian again.
The Debian image was large. After the installation, I checked the disk usage with ``df -h''... 93% used... It's crazy.
So, after reflashing the on-board eMMC of the BBB, I removed things like /usr/share/doc, Chromium, and even Vim (there's still vi for later use). This time, the system updated without errors.
The following paragraphs are my notes on the Wi-Fi settings of the BBB with the Debian image, followed by the VNC settings. Most of the steps follow this guide: BeagleBone Black Ubuntu WiFi Installation and Troubleshooting Guide.
PART 1 Wifi settings
Step 1: Make sure the system recognizes the Wi-Fi adapter. Run ``lsusb'' to list the detected devices. In my case, it's the line with ``Ralink Technology, Corp.''
Step 2: Check ``/etc/rcn-ee.conf'' and make sure it contains the following settings:
Step 3: Edit ``/etc/network/interfaces'' with the following settings:
auto ra0
iface ra0 inet dhcp
    wpa-ssid your_network_name
    wpa-psk your_hashed_password

Finally, restart the BBB system.
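(A side note not in the original guide: if you need the hashed password, the standard wpa_supplicant helper can generate it. For example, $ wpa_passphrase your_network_name your_plaintext_password prints a network block whose psk= line is the hash to put after wpa-psk.)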
PART 2 VNC settings
This part was easy. All the steps were the same as in another previous post of mine (Connect to Beaglebone Black via VNC client in Ubuntu 12.04). The only difference was the command used to start the VNC server (on the BBB):

x11vnc -bg -o %HOME/.x11vnc.log.%VNCDISPLAY -auth /var/run/lightdm/root/:0 -forever
Friday, November 07, 2014
Two copies of clang_complete make Vim throw error messages
Recently I found that error messages appeared when I opened C/C++ files in Vim (see the following photo).
So I googled, and installed the libclang-dev package. After the installation, the error messages became even worse:
XDD
Fortunately, I spotted a thread in the clang_complete GitHub repository, and the author's comment turned out to be the solution for me.
I must have installed clang_complete manually before, and then it was installed again by Vundle. :-p
---
Another useful link found during the search:
clang_complete can be configured to use the clang executable or the clang library
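For reference, switching clang_complete to the library backend comes down to a couple of .vimrc options; a minimal sketch, where the library path is a placeholder that depends on the installed LLVM version:

let g:clang_use_library = 1
" placeholder path; point this at the directory containing libclang.so
let g:clang_library_path = '/usr/lib/llvm-3.4/lib'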