Planet maemo: category "feed:dc2d42ffa90d409ad35691447d64bb45"


If you use your laptop as a desktop replacement, you will at some point get an external display and a mouse and keyboard for more convenient use. At this point the laptop becomes little more than a small case of non-upgradable components.

Categories: Articles

Streaming the Screen on Android

2014-09-11 22:37 UTC  by  madman2k

In this post I want to discuss a way of getting the screen content of your Android device onto a TV or monitor. If you wonder why one might want to do such a thing, just think about playing some Android games with a Bluetooth gamepad or watching a movie where your PC is not available.

Specifically I want to introduce SlimPort, a feature of Nexus devices which is unfortunately not covered much in reviews.
Basically SlimPort is DisplayPort over the micro-USB connection of your device, allowing you to mirror its display.

But the future has arrived: we got Miracast!

One might wonder why one should go through the hassle of using an old-school HDMI cable.
After all, you can get a Chromecast stick for $35, and nowadays it also supports Miracast, so you can simply stream the image over WiFi.

Well, Miracast is all nice if all you need is to put up some slides without carrying every possible adapter with you. But as soon as you try to stream a movie or a game you will reach its limitations.

Remember that Miracast works by grabbing the framebuffer and compressing it with H.264. While the encoding happens in hardware, it still takes some time and inevitably introduces compression artifacts. This means:

  • in games you get a noticeable lag – especially in FullHD
  • in movies you get noticeable artifacts – especially in FullHD
  • in both cases your battery will get drained by heavy WiFi and encoder usage
Going old-school

Going with the old-school cable, on the other hand, you get HDMI 1.4 transfer rates for up to 1080p at 60 Hz while saving the battery.

Configuring the second screen is quite straightforward in Android. As mirroring is your only option, there is actually nothing to configure. Once you connect the adapter, Android will set up your monitor based on its EDID information and transfer image and audio over HDMI.
In case you only want the image over HDMI, simply attach your speakers to the phone and Android will re-route the audio.
The days where you had to set up everything manually are over.

Furthermore most adapters have a micro-USB port that allows you to keep charging your phone while using SlimPort.

Device Support

The downside is that most devices do not support SlimPort. The device list more or less boils down to

  • Google Nexus 4/ 5
  • Google Nexus 7 (2013)
  • LG G2/ G3

Samsung devices go with the alternative MHL. Comparing the two, SlimPort has the bandwidth advantage at 5 Gbit/s vs. 3 Gbit/s for MHL, so it does not have to compress as much. However, both are clearly better than going wireless.

Categories: News

Secure Owncloud Server

2014-04-18 19:46 UTC  by  madman2k

This article is about how to securely configure the machine where your Owncloud instance will be running.
Even if you set up your connection to Owncloud in a secure way, your data can still be compromised by exploiting security flaws in the underlying architecture.

In the following we will specifically cover the underlying software stack and brute-force password attacks.

Automatically install security updates

No software package is perfect – there might be security holes anywhere in the stack, starting with the Linux kernel up to the SSL library in use.
However, most security holes that are actually exploited are publicly known, and security updates have already been provided for them.
The only reason they can still be exploited is that people do not install the security updates in time. Especially if there is no server admin dedicated to maintaining the Owncloud machine, one can easily miss such updates.

Fortunately it is very easy to enable automatic security updates on Debian-based distributions with

sudo dpkg-reconfigure -plow unattended-upgrades
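
On Debian and Ubuntu this command writes a small APT configuration file that turns on the daily unattended-upgrade run. If you prefer to check or create it by hand, it should end up looking roughly like this

/etc/apt/apt.conf.d/20auto-upgrades

APT::Periodic::Update-Package-Lists "1";
APT::Periodic::Unattended-Upgrade "1";
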
Prevent brute-force password hacks

Unfortunately Owncloud 6 is still vulnerable to brute-force password attacks in its default configuration, as it does not enforce timeouts after failed login attempts.

Therefore one might just try all possible passwords and gain access to your machine in about 3 days for a typical password length.

To prevent this we can use fail2ban to enforce a timeout after a certain number of failed login attempts.

First install fail2ban

sudo apt-get install fail2ban

Fail2ban works by parsing the log files of a service and then reconfiguring the firewall in order to ban the offending IP address.

So we need to tell Owncloud to log failed login attempts. To do so, edit

owncloud/config/config.php

  'logtimezone' => '<TIMEZONE>',
  'logfile' => '/var/log/owncloud.log',
  'loglevel' => '2',
  'log_authfailip' => true,

Note that logtimezone must match the timezone of your server's clock, otherwise fail2ban will misinterpret the timestamps in the log.

Next create the following filter definition for fail2ban

/etc/fail2ban/filter.d/owncloud.conf

[Definition]
failregex={"app":"core","message":"Login failed: user '.*' , wrong password, IP:<HOST>","level":2,"time":".*"}
          {"app":"core","message":"Login failed: '.*' \(Remote IP: '<HOST>', X-Forwarded-For: '.*'\)","level":2,"time":".*"}

The top line is for Owncloud <= 7.0.1, the bottom line for Owncloud 7.0.2.

This works together with the following jail definition

/etc/fail2ban/jail.local

[owncloud]
enabled = true
filter  = owncloud
port    = https
logpath = /var/log/owncloud.log
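
If you want to be explicit about the ban behaviour instead of relying on the fail2ban defaults, you can add the standard options maxretry, findtime and bantime to the same jail section. The values below are only a sketch matching the behaviour described next (ban for 15 minutes after 3 failed attempts within 10 minutes):

maxretry = 3
findtime = 600
bantime  = 900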

Now restart fail2ban and try to log in four times with a wrong password. The fourth attempt should give you a timeout (for 15 minutes).
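
On Ubuntu the restart and a quick sanity check can look like this (fail2ban-client ships with the fail2ban package):

sudo service fail2ban restart

# show the jail status including currently banned IP addresses
sudo fail2ban-client status owncloud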

Categories: Articles

How to manually update a deb package from source

2014-03-15 12:03 UTC  by  madman2k

Probably everyone has encountered a package in Ubuntu which was not the newest released version although, for some reason, one needed the newest one. The first step is to search for a PPA with the desired version. But what if there is no such PPA, or you want to build the version yourself? This is where this guide comes in. Note however that it is not aimed at ordinary users – you need some experience with programming and compiling to successfully build a package.

Before you start

Before you start, make sure that you have source packages enabled in your software sources.
Next you obviously need the upstream source tar-ball of the new program, which should look something like <packagename>-<version>.tar.gz.
Download this tar-ball to a new directory <somedir> and extract it there.
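
A minimal sketch of these preparations on the command line – the deb-src line and the download URL are only examples and will differ on your system:

# make sure deb-src entries are enabled in /etc/apt/sources.list, e.g.
# deb-src http://archive.ubuntu.com/ubuntu trusty main universe
sudo apt-get update

# fetch and extract the upstream tar-ball into a fresh directory
mkdir <somedir> && cd <somedir>
wget http://example.org/releases/<packagename>-<newversion>.tar.gz
tar xzf <packagename>-<newversion>.tar.gz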

Updating Package info

For the following commands I assume you are in the previously created directory <somedir>.

First we need to get the old version of the source package

apt-get source <packagename>

This will download and extract the old source package into <packagename>-<oldversion>.

Now we need some helper scripts to perform the upgrading as well as the build-time dependencies of the package

sudo apt-get install dpkg-dev devscripts fakeroot
sudo apt-get build-dep <packagename>

Next change into the extracted sources of the old package and update the packaging

cd <packagename>-<oldversion>
uupdate -v <newversion> ../<packagename>-<newversion>.tar.gz

# change into the extracted new package
cd ../<packagename>-<newversion>

# update version info
dch -l ~ppa -D $(lsb_release -sc)

For more information see the Debian New Maintainers Guide.
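
After these two commands debian/changelog should start with a new entry roughly like the one below – the exact version string depends on what uupdate generated and on your release (trusty, name and e-mail are just placeholders):

<packagename> (<newversion>-0ubuntu1~ppa1) trusty; urgency=low

  * New upstream release.

 -- Your Name <you@example.com>  Sat, 15 Mar 2014 12:03:00 +0100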

Building the program

To trigger a rebuild of the program simply execute

dpkg-buildpackage
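
If the build succeeds, the resulting packages end up one directory above, i.e. in <somedir>. A common variation is to skip signing and only build the binary packages; the resulting .deb can then be installed directly (file names are illustrative):

# -us -uc: do not sign, -b: build binary packages only
dpkg-buildpackage -us -uc -b

cd ..
sudo dpkg -i <packagename>_<newversion>*.deb
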
Uploading your version to a PPA

To upload a package to a PPA you first need to sign it to prove that you are the author. To do this you have to execute the following in the <packagename>-<newversion> directory

debuild -S

Furthermore you need the upload tool dput to actually perform the uploading

sudo apt-get install dput

Now change to <somedir> and execute

dput ppa:<your_username>/<repository> <source.changes>
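
Filled in with hypothetical values – Launchpad user "jdoe", a PPA named "testing" and the source.changes file produced by debuild -S – this could look like:

dput ppa:jdoe/testing <packagename>_<newversion>-0ubuntu1~ppa1_source.changes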

You can find more information at Launchpad.

Categories: Articles

Secure Owncloud setup

2014-02-22 10:53 UTC  by  madman2k

While the Owncloud manual suggests enabling SSL, it unfortunately does not go into detail on how to get a secure setup. The core problem is that the default SSL settings of Apache are not sane, in that they do not enforce strong encryption. Furthermore the default certificate will not match your server name and will produce errors in the browser.

Categories: Articles

How to root Android using Ubuntu

2014-01-12 13:14 UTC  by  madman2k
The Big Picture

Android consists of three parts relevant to rooting

Categories: Articles

I recently ran into this problem and could not find any good solution on the Internet. So what follows is a small summary of the problem, with hopefully enough buzzwords that Google can lead you here.

If you want to do C++ development on Android, you need the NDK for cross compilation. It comes by default with its own build system called ndk-build, which is basically a bunch of custom makefiles. But if you are sharing code between the Android platform and, let's say, plain Linux, you likely already have a build system installed. For C/C++, CMake is quite popular as it supports different platforms and compilers. Fortunately there is already a project which adds Android support to CMake. I will not cover that here – instead I assume you are already using it.

Unfortunately you can't use the ndk-gdb script supplied with the NDK to debug your application, as it relies on the behaviour of ndk-build. But as said earlier, ndk-build is no wizardry, just a bunch of scripts. So it is possible to emulate its behaviour using CMake, as follows:

Add the following macro to your CMakeLists.txt file

macro(ndk_gdb_debuggable TARGET_NAME)
    get_property(TARGET_LOCATION TARGET ${TARGET_NAME} PROPERTY LOCATION)
    
    # create custom target that depends on the real target so it gets executed afterwards
    add_custom_target(NDK_GDB ALL) 
    add_dependencies(NDK_GDB ${TARGET_NAME})
    
    set(GDB_SOLIB_PATH ${PROJECT_SOURCE_DIR}/obj/local/${ANDROID_NDK_ABI_NAME}/)
    
    # 1. generate essential Android Makefiles
    file(WRITE ${PROJECT_SOURCE_DIR}/jni/Android.mk "APP_ABI := ${ANDROID_NDK_ABI_NAME}\n")
    file(WRITE ${PROJECT_SOURCE_DIR}/jni/Application.mk "APP_ABI := ${ANDROID_NDK_ABI_NAME}\n")

    # 2. generate gdb.setup
    get_directory_property(PROJECT_INCLUDES DIRECTORY ${PROJECT_SOURCE_DIR} INCLUDE_DIRECTORIES)
    string(REGEX REPLACE ";" " " PROJECT_INCLUDES "${PROJECT_INCLUDES}")
    file(WRITE ${PROJECT_SOURCE_DIR}/libs/${ANDROID_NDK_ABI_NAME}/gdb.setup "set solib-search-path ${GDB_SOLIB_PATH}\n")
    file(APPEND ${PROJECT_SOURCE_DIR}/libs/${ANDROID_NDK_ABI_NAME}/gdb.setup "directory ${PROJECT_INCLUDES}\n")

    # 3. copy gdbserver executable
    file(COPY ${ANDROID_NDK}/prebuilt/android-arm/gdbserver/gdbserver DESTINATION ${PROJECT_SOURCE_DIR}/libs/${ANDROID_NDK_ABI_NAME}/)

    # 4. copy lib to obj
    add_custom_command(TARGET NDK_GDB POST_BUILD COMMAND mkdir -p ${GDB_SOLIB_PATH})
    add_custom_command(TARGET NDK_GDB POST_BUILD COMMAND cp ${TARGET_LOCATION} ${GDB_SOLIB_PATH})

    # 5. strip symbols
    add_custom_command(TARGET NDK_GDB POST_BUILD COMMAND ${CMAKE_STRIP} ${TARGET_LOCATION})
endmacro()

Then use it like

add_library(YourTarget ...)
ndk_gdb_debuggable(YourTarget)

You should now be able to use ndk-gdb with CMake, just as if you had used ndk-build.
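
A typical invocation, assuming the ndk-build style project layout that the macro emulates (jni/, libs/ and obj/ in the project root) and an application built as debuggable:

# run from the project root; --start launches the activity before attaching
ndk-gdb --start --verbose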

Note that steps 4 and 5 are optional for debugging. They just reduce the size of the library that has to be transferred to the device. If you don't care, you can just leave them out. But then the solib search path from step 2 must be set to:

file(WRITE ./libs/${ANDROID_NDK_ABI_NAME}/gdb.setup "set solib-search-path ./libs/${ANDROID_NDK_ABI_NAME}\n")

Ideally someone should integrate that in the Android toolchain linked above.

Update: merged upstream.

Categories: Articles

GNOME Project suffering the NIH disease

2011-12-10 14:03 UTC  by  madman2k

When I first read about GNOME dropping support for BSD and Solaris, my impression was that this was a good idea, aiming to unify limited resources and get the work done. I was also excited about the idea of the GNOME OS. I think it is necessary to keep the big picture in mind when developing the different components. Previously Ubuntu was the only project that did this, and it was also the reason why I started using Ubuntu: it made the different parts of Linux work together to achieve the big goal of a great overall system.

Categories: Articles

Doing the right thing

2011-03-02 11:53 UTC  by  madman2k

Canonical is doing the right thing. Yes, morally as well. By choosing the MIT/X11 license instead of the GPL, the Banshee developers explicitly allow using Banshee in a closed-source, for-profit project without giving anything back.

Starting to whine about morals now that someone actually takes advantage of this right is somewhat premature – in the end you had the choice of how to license it, right? If you don’t like what happens, change the license! Maybe a proprietary one this time, as open source obviously is not restrictive enough for you and you have to resort to “morality”.

As for me, I would be perfectly happy if Canonical simply kept 100% of the Amazon revenue – after all it’s their product (yes, putting the pieces together makes it something new).

As a user I mostly care whether the product works, and I use Ubuntu because it works best for me. And since Canonical has done a great job so far providing what I want, I think it should be up to them whether to spend the money on shiny new icons or to give something back to the Banshee developers.

For reference: this and this.

Categories: News

Augmented Reality on the N900

2010-06-16 10:22 UTC  by  madman2k

Finally I have reached a stage where I could upload my small augmented reality app to extras-devel, so all those who asked for it can now play with it. But be aware that it is in extras-devel for a reason. In case you are wondering what I am writing about, here is a video of the demo:

In order to make it work, you will have to print the ARToolKitPlus markers. Furthermore there are these controls:

  • scale the objects using the volume buttons
  • select one of the objects for scaling by tapping on it
  • tapping on the palette symbol triggers annotating by drawing on the screen
  • tapping on the sun symbol fixes the sun to the current device position
  • once fixed the shadows can be rotated using the arrow keys
Categories: News

Handheld based interaction using AR

2010-02-25 15:49 UTC  by  madman2k

It is time for the next demo of my project, as I have reached beta status (feature complete). I think you can see quite clearly now where this is going and which kinds of interaction will be possible using this technique. The concrete features are described in the annotations.

If you use more advanced tracking methods and add some physics to this, you could easily port Numpty Physics to this kind of interaction or create an easy-to-use level editor. In case you missed my last video, here is the link.

It will still take some weeks until this hits maemo-extras, as there are some bugs left and I still want to get rid of keyboard use for interaction.

Categories: News

Thoughts about MeeGo

2010-02-18 15:47 UTC  by  madman2k

In a country with freedom of speech, one has to say something about every event, right? So here is my take:

Basically the merge of Maemo and Moblin is logical and consistent, as Nokia and Intel already collaborated on oFono, and merging helps join the efforts. This is quite necessary, as neither Maemo nor Moblin could survive on its own in a world where everybody else uses Android and will soon start using Chrome OS. There was also a need to make Moblin more like Maemo to be able to compete with the iPad.

That sounds great so far, right? Joint efforts, bigger community, open to everybody… But the problem is the way the merge is going to happen. Moblin is more or less a huge tech demo so far – everybody I know uses Ubuntu Netbook Remix on their netbook, as it is more production-ready and end-user oriented. The same also applies to Maemo.

It is a bit sad that the next Maemo/ MeeGo Harmattan will be Qt-based though, as it means that all the currently working applications have to be rewritten without gaining an immediate benefit. But considering that Qt is technically more advanced than GTK and that it allows deploying your application on the different OSes Nokia uses, this is understandable.

What is less understandable is that MeeGo will be based on RPM/ Moblin/ Fedora. And at least for Fedora the official motto is merging new features as fast as possible, which is nice for developers but less nice for end users, as the distribution is less stable. So while it is logical to base a tech demo like Moblin on Fedora, I would not base anything that is supposed to be stable on it.

But this is exactly what is going to happen with MeeGo. This means Maemo has to abandon its Debian roots and rebase everything to RPM. By everything I mean the huge amount of packaging experience gained during the last 5 years, the build infrastructure and of course the core package management applications. This also has an impact on the community infrastructure: downloads and karma are coupled to the DEB format too.

So what do we gain by rebasing to RPM? Maybe the Moblin interface, which is indeed nice? Actually no, as it is Clutter/GTK-based while MeeGo will use Qt – besides, the Moblin interface was packaged by Ubuntu as a deb too. OK, the Moblin community will not need to change its infrastructure, but is the Moblin community actually that big?

As I really wonder why we are switching to RPM, I started a wiki page to collect the arguments, and as it does not look too good for RPM, also a brainstorm vote.

Categories: News