I love a girl's quietness,

A girl's resilience.

Love spacing out,

Love silly grins,

Love making faces,

Love Stefanie Sun (孫燕姿),

Love going unseen,

Love this song.

 


Trip

This is exciting!

Lifelong trip of being forward and positive!

How come the decision was made so late?!

Never stop running!

Never stop being fast!

B-)

Shell

In my hand,

You are son of the ocean,

Son of the stars.

You perceive everything,

Transform all surfaces to an understanding.

Like a shell,

With messages of the mother of all beings,

You come here, and warm up everything.

Interactively Update Uniforms of a Custom GLSL Shader in the Blender Game Engine

An easy way to interactively update the uniform variables of a custom GLSL shader is to use game properties as the uniform values, and to update the game properties from keyboard input, which triggers an update of the game frame. For example, to accumulate a floating-point uniform `verticalSunDirection_cameraSpace` in the fragment shader, the steps are as follows:

  • Add a game property `verticalSunDirection` to the object the shader is attached to.

(Screenshot: adding the game property)

  • Assign the game property `verticalSunDirection` to the uniform variable `verticalSunDirection_cameraSpace`. For debugging purposes, we print the updated value of `verticalSunDirection` when assigning it to the uniform, at line 619 of the script in the screenshot below (a sketch of such a controller script is also given after these steps):

(Screenshot: assigning the game property to the uniform)

If you are curious about the try-except block, see the “Logic” part of the “Simple endless scroll” section in [1] for an explanation =)

  • Add a keyboard sensor that adds a fixed value of 0.1 to the game property `verticalSunDirection` while the game is running (the game is started with the `p` key). Here we use the up arrow key:
    (Screenshot: keyboard sensor setup)
  • Test the updated value of the game property `verticalSunDirection`:
    • Run the game with the `p` key.
    • Hit the up arrow key; the game console then prints the updated `verticalSunDirection`:

(Screenshot: game console output)
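
For reference, here is a minimal sketch of what such a controller script can look like, assuming the custom vertex/fragment shader source has already been attached to the material (as in the screenshots) and that this controller is wired to the keyboard sensor above; the function name and loop structure are illustrative, not the exact code from the screenshots:

    import bge

    def update_sun_direction():
        cont = bge.logic.getCurrentController()
        obj = cont.owner  # the object that carries the 'verticalSunDirection' game property

        # Keyboard-sensor step: when the up-arrow sensor fires, add 0.1 to the game property.
        for sensor in cont.sensors:
            if sensor.positive:
                obj['verticalSunDirection'] += 0.1

        # Uniform-assignment step: copy the game property into the shader uniform each logic tick.
        for mesh in obj.meshes:
            for mat in mesh.materials:
                shader = mat.getShader()
                if shader is None:
                    continue
                try:
                    # Print the value for debugging, then assign it to the uniform.
                    print('verticalSunDirection =', obj['verticalSunDirection'])
                    shader.setUniform1f('verticalSunDirection_cameraSpace',
                                        obj['verticalSunDirection'])
                except Exception as err:
                    # See [1] (“Simple endless scroll” -> “Logic”) for why the
                    # assignment is wrapped in a try-except.
                    print(err)

    update_sun_direction()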

References

[1] Scrolling Textures with GLSL. https://whatjaysaid.wordpress.com/2015/03/09/bge-scrolling-textures-with-glsl

The Girls

Rains are gloomy,

But I am hearing the bloom.

In this misty morning of April,

I am missing you, the girls in my life.

Like flowers,

Your smiles brightened the sky,

Deeply planted into flourishing lives.

Written for the girls in this world. It is my luck to have seen your beauty. In some moments, thank you for stopping by in my life, enriching it with your smiles.

Sophie

April 6, 2017. Cambridge, MA.

Girls (女孩)

Pitter, patter,

The rain plays slower and slower.

In this misty, drizzly April morning,

I hear the sound of flowers blooming.

The girls' smiles,

Silent and unheard,

Are what life leans on as it keeps moving forward.

Written to remember the girls in this world; seeing your beauty is my luck. In some moments, thank you for passing by, leaving flower-like smiles along the journey of my life.

Sophie

April 6, 2017, Cambridge

Wings (羽翼)

The traces of love

Can be a thin curve,

Or dust in the sunlight.

Crossing paths

Is a helpless missing of each other.

Why can't my eyes

See those specks of dust, glowing with color,

Those floating, beautiful wings?

Yet my life

Needs the strength of those wings.

I will keep writing,

To kiss the gift of the wings.

I will bear the pain,

To keep my promise to you.

Like clear notes weaving into the softness of the strings,

I finally learn

That blur can be a kind of certainty,

An undying force of life.

Like you,

Who knows the crevices the sunlight threads through.

Like me,

Breathing in these blurred crevices.

To my dearest Jing (婧),

The missing pieces of my life, because of your wings, are made whole.

Your elder sister, on the night of March 27, 2017, Cambridge

 

 

 

That Snow

Slowly, following the light white flakes floating down, quietly onto the ground.

I may not be as free as the flakes.

The mind drives, but is set free with the following.

I will give.

I will forget.

Because even the snow takes just a fresh breath, then melts gently.

Add Face Module to opencv2.framework

When it comes to recognizing faces and you realize the module is not in opencv, what do you do? Cry? Maybe. I spent 1.2 days figuring out that the face module is not in that repository; instead, it's in opencv_contrib. Even worse, we can't use a Podfile to generate a framework that includes a single module from opencv_contrib!

Not many blogs mention this, so I'm recording it here. These are the steps to manually add a single module from opencv_contrib to opencv2.framework:

  • Download opencv from https://github.com/opencv/opencv
  • Download opencv_contrib from https://github.com/opencv/opencv_contrib
  • Copy the face module in opencv_contrib (opencv_contrib/modules/face) to opencv’s modules folder (opencv/modules).
  • Build opencv2.framework [2]:
    1. Make a symbolic link for Xcode so the OpenCV build scripts can find the compiler, header files, etc.:
      cd /
      sudo ln -s /Applications/Xcode.app/Contents/Developer Developer
    2. Build opencv2.framework
      cd ~/<my_working_directory>
      python opencv/platforms/ios/build_framework.py ios
  • If everything is fine, a few minutes later you will get ~/<my_working_directory>/ios/opencv2.framework. You can add this framework to your Xcode projects [2].
  • Add the following built-in frameworks to the project, because opencv2.framework depends on them:
    • AssetsLibrary.framework
    • CoreMedia.framework
    • AVFoundation.framework
  • Done! Now you can compile and run the project with face recognition included!
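
If you expect to rebuild the framework more than once, the copy-and-build steps can also be scripted. Here is a small Python sketch, assuming both repositories were cloned into a single working directory; WORK_DIR is a placeholder you would adjust:

    import shutil
    import subprocess
    from pathlib import Path

    # Placeholder: the directory where opencv and opencv_contrib were cloned.
    WORK_DIR = Path.home() / 'my_working_directory'
    OPENCV = WORK_DIR / 'opencv'
    OPENCV_CONTRIB = WORK_DIR / 'opencv_contrib'

    def add_face_module_and_build():
        # Copy opencv_contrib/modules/face into opencv's modules folder.
        src = OPENCV_CONTRIB / 'modules' / 'face'
        dst = OPENCV / 'modules' / 'face'
        if not dst.exists():
            shutil.copytree(str(src), str(dst))

        # Build opencv2.framework; the output lands in WORK_DIR/ios/opencv2.framework.
        subprocess.check_call(
            ['python', str(OPENCV / 'platforms' / 'ios' / 'build_framework.py'), 'ios'],
            cwd=str(WORK_DIR))

    if __name__ == '__main__':
        add_face_module_and_build()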

References

[1] Integrating aruco in opencv_contrib into opencv2.framework. https://github.com/opencv/opencv/issues/6530.

[2] Installation in iOS. http://docs.opencv.org/2.4/doc/tutorials/introduction/ios_install/ios_install.html#ios-installation.