Monday, 24 October 2016

VS Code for Go - get a function list

I have been using VS Code to write my Go code ever since my last MacBook died of caffeine poisoning (don't ask).

I had been using UltraEdit since 1998 and had found nothing that really changed my mind, but I had been exploring VS Code for Ethereum work.

The new MacBook was supposed to be a temporary fix, but four months on, with no sign of the old MacBook coming back, it is looking quite permanent now.

So - what do I like about it? A lot. And I am learning more every day.

Today's tip

UltraEdit used to have the option of showing a list of functions in a separate pane. I could not find an equivalent pane in VS Code, but here is how to get the same information.

Cmd+P pops up the action entry box

Type '@' in this box and you get a list of the imports, variables, functions, etc. in your current file.

Selecting one of these takes you to the correct place in the file.

Sunday, 4 September 2016

POSTGRES - Change money field to Integer....

So, we had a table with a money field when all of a sudden we realised that it would make a lot more sense for the amount to be expressed in cents instead of in dollars.

$1234.56 would now become 123456

I am sure that there is a far better way to do this, but I am an occasional DB admin, so I figured that I needed to:

  1. Convert dollars to cents
  2. Convert Money to Integer
#1 was easy: update bills set amount = amount * 100; (multiplying by 100 turns the dollar amounts into cents, matching the example above)


#2 was a bit more tricky. Very hard to convert Money to Int.

amount::varchar does, however, yield a string (something like '$123,456.00'), so

Create a new int column:

Alter table bills add column iamount int;

Then copy the correct amount into it:

-- substr drops the leading '$' and the trailing '.00',
-- replace then strips the thousands separators before the cast to int
update bills set iamount =
     replace(substr(amount::varchar,2,length(amount::varchar)-4),',','')::int;

Then drop the old column and rename the new....


alter table bills drop column amount;
alter table bills rename column iamount to amount;

Thursday, 30 June 2016

Compiling Go code for the BeagleBone Black

One thing I love about Go is that you can cross-compile code for different platforms.

The Ethereum Foundation uses this feature to create developer releases of Geth (the Ethereum node written in Go) for many platforms relatively quickly.

It also means that I do not need to wait ages for a slow processor like the Raspberry Pi 1 or the BeagleBone Black to make the build; I can do it from my MacBook.

I first learned how effortless it was from Audrey Lim's go-snap project

There are more details about cross compiling for ARM in the GoArm Wiki

So finally - to compile on Linux or a MacBook for the BBB*:

$ GOARCH=arm GOOS=linux GOARM=7 go build myBBBprogram.go
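
For a quick sanity check, myBBBprogram.go (that name is just the placeholder used in the command above) could be as simple as the following sketch, which prints the platform it was built for - run it on the BBB and it should report linux/arm:

package main

import (
    "fmt"
    "runtime"
)

func main() {
    // GOOS and GOARCH are fixed at compile time, so a successful
    // cross-compile will report linux/arm when run on the board
    fmt.Printf("Hello from %s/%s\n", runtime.GOOS, runtime.GOARCH)
}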

Once it has finished you should have a cross-compiled executable. All you have to do is transfer it to the BBB with scp or an sftp utility and you are in business!

*or on a PC if you really have to...

Set Fixed IP Address on BeagleBone Black (Debian)

After a couple of years of serious distractions, I dragged my BeagleBone Blacks out of the cupboard and am putting them to use hosting test software in the office so that people can access them while I am away.

I downloaded the latest Debian image because it seems that Arch had not been updated recently.

Many things have changed, in particular the connection manager (connman), but setting a fixed IP address is still a very simple process.

These commands may require running as root (via sudo or su).

First find your service name (I have truncated the names of mine for brevity)

connmanctl services
your_network_id            wifi_000c


Then set the IP address, netmask and gateway address

# connmanctl config wifi_000c --ipv4 manual 192.168.0.200 255.255.255.0 192.168.0.1

If you have logged in via SSH and are entering the commands remotely, the session will now appear to freeze, because the IP address has already changed underneath your connection.

Reboot and your connection is up on the new address.

Tuesday, 5 April 2016

Getting Started with Ethereum

It has become clear that cryptocurrencies are not so much about using the currency as a means of saving or speculation; they are more about using their key features to build platforms that have value, and doing so in a more secure, flexible and inexpensive way than was possible without the technology.

The result of this is that, while you still see many people mining Bitcoin, stashing it and praying for the value to increase back to the glory days, there are many more services that buy and re-sell bitcoin almost instantaneously in different markets, using the technology as a way of transferring value around the world at the speed of light.

But still, Bitcoin was a currency. Ripple, Stellar and their spinoffs moved this forward by becoming asset trading platforms. The Ripple and Stellar currencies were merely the oil for the machine, their value being the incentive to keep the wheels of the processing nodes turning.

Stellar has some excellent features. Automatic exchanges can be set up, and so can peer exchanges: anybody who deals in two assets may propose exchange rates, and the best rate wins. But at the end of the day they are still rigid systems, and you need to build your own system on top to take advantage of them.

A couple of years ago there were (if you will pardon me for saying it) ripples in the ether. A paper had been written proposing what were called "smart contracts".

Smart contracts are small programs inserted into a blockchain which get run every time somebody wants to interact with them. People who want to use a smart contract have to pay for the privilege because thousands of people worldwide are expected to offer computing power to run these programs. The programs themselves are ultimately very flexible - written in languages similar to Javascript or Python.

At the time of writing, Ethereum, the result of all this work, is starting to take the crypto world by storm. While still officially in beta, people are using it to manage all kinds of things - from auctions and trading systems to online gambling.

OK, I hear comments. That is a huge amount of verbosity. What's the point?

The main point is that I am exploring Ethereum and have been doing a bit of coding. Some of it digs quite deep into the Ethereum source code.

In the process, I am building some useful libraries which I will be sharing in a way that allows us to explore the power of Ethereum.

Expect to hear from me soon.

Sunday, 4 October 2015

Processing 3 is out - but to get it to work with Android on the Mac takes a bit of trouble

Processing - one of my favourite tools for PC-side development - has just had a new release. Unfortunately it does seem a bit raw at the moment.

You can download Processing 3 from the usual site http://www.processing.org and you can find the recommended setup information at https://github.com/processing/processing-android/wiki, but a number of people, myself included, seemed to be having trouble getting it to work.

After some digging on the forums I did manage to get it working so I thought I would summarise what had to be done.

1) If you have Android Studio or Eclipse installed, locate the Android SDK folder; otherwise you will need to download it (SDK only) as described in the wiki above.

2) I am not sure if this is 100% correct, but according to the wiki you will need to install API 10. You can use the preferences in Android Studio or run the android tool in the SDK's tools folder to do so.

3) I did add the variables to .bash_profile but that did not work for me. Further digging led me to this post: http://forum.processing.org/two/discussion/12665/android-sdk-could-not-be-loaded

Summarizing:

  1. Open Processing's preferences and note the path of preferences.txt, probably
    /Users/<your name>/Library/Processing/preferences.txt
    (obviously <your name> needs to be replaced by your user name).
  2. Close Processing
  3. Edit preferences.txt and add the following two lines at the beginning
    android.sdk.path=/Users/<your name>/Library/Android/sdk
    android.sdk.version=10
That seemed to do it for me. I haven't tried using an SDK newer than 10, even though somebody has claimed that it may work, and I haven't run a full test - but not getting the error must be a good start.

Sunday, 7 September 2014

Loading an array/slice of objects from a JSON file in Golang

As usual, with a new project arriving my first question is "what can we learn on this one?"

So, having found that Go (aka Golang) is a descendant of Modula-2 and has some pretty nifty features, I decided that Golang was the way to Go (if you will pardon the expression).

The next thing that I do is head straight for the target, reading just enough to get me there. If the journey is a happy one I may flesh things out later, but when you are relying on the internet and googling, the search results are a bit like flicking through a book - but not as good because sometimes a book yields a gem that you weren't looking for.

The project is made up of multiple servers (Raspberry Pis) doing crazy stuff, all talking to each other as well as to hardware and Arduinos. The first server was a sound player controlled by other remote servers. It loaded a name->filename map from a JSON file. Not too tricky.
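
For completeness, that name->filename map really is the easy case. Here is a minimal sketch of it (the file name, keys and values are made up for illustration, not taken from the real project):

package main

import (
    "encoding/json"
    "fmt"
    "io/ioutil"
    "os"
)

func main() {
    // sounds.json is a made-up example: { "alarm": "alarm.wav", "welcome": "welcome.wav" }
    data, err := ioutil.ReadFile("./sounds.json")
    if err != nil {
        fmt.Printf("File error %v\n", err)
        os.Exit(1)
    }

    var sounds map[string]string
    if err := json.Unmarshal(data, &sounds); err != nil {
        fmt.Printf("JSON error %v\n", err)
        os.Exit(1)
    }

    fmt.Println(sounds["alarm"]) // prints alarm.wav
}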

The second one needed some routing tables loaded. I wanted pure object arrays loaded from JSON.

Note: you can't use an array here, you need a SLICE. Arrays are fixed size - the length is part of the type - whereas a slice can grow to hold however many records the file contains.

I Googled and came across lots of people on StackOverflow asking the same question.

A lot of the answers and questions revolved around interface{}, which I tried and got working, but it seemed a bit tricky because there are multiple type conversions involved - silly when all the records are the same type. And there seemed to be NOBODY explaining how to do it with a proper typed slice.
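
For comparison, here is roughly what that interface{} route looks like - a minimal sketch rather than code from any particular answer, reading the same sensor.json file used further down. Every field has to be pulled out with a type assertion:

package main

import (
    "encoding/json"
    "fmt"
    "io/ioutil"
    "os"
)

func main() {
    file, err := ioutil.ReadFile("./sensor.json")
    if err != nil {
        fmt.Printf("File error %v\n", err)
        os.Exit(1)
    }

    // no struct: every record comes back as a map of interface{} values
    var generic []map[string]interface{}
    if err := json.Unmarshal(file, &generic); err != nil {
        fmt.Printf("JSON error %v\n", err)
        os.Exit(1)
    }

    for _, rec := range generic {
        // each field needs a type assertion before it can be used as a string
        source, _ := rec["Source"].(string)
        action, _ := rec["Action"].(string)
        fmt.Println(source, "->", action)
    }
}

It works, but the assertions (and the checks you really ought to do on each one) are exactly the clutter that a proper struct avoids.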

So, if you are trying to load a slice of objects from a JSON file - bearing in mind I had only been doing this for a couple of weeks, with a lot of that time goofing off - here it is....

The data record I want to load

type command_record struct {
    Source    string
    What      string
    Condition string
    Value     string
    Host      string
    Action    string
}

Which we will represent in JSON as

[
    {
        "Source": "sensor",
        "What": "temp",
        "Condition": "GT",
        "Value": "35",
        "Host": "sensor",
        "Action": "REDON"
    },
    {
        "Source": "host",
        "What": "sensoralarm",
        "Condition": "NIL",
        "Value": "0",
        "Host": "sensor",
        "Action": "BLUOFF"
    }
]

So the slice is represented as 

var sensor_script *[]command_record

Note the *: we are only declaring a pointer here; json.Unmarshal will allocate the slice for us when it fills it in.

And we read the file into a slice of bytes

// ioutil.ReadFile needs "io/ioutil" (and os.Exit needs "os") in the import list
file, e := ioutil.ReadFile("./sensor.json")
if e != nil {
    fmt.Printf("File error %v\n", e)
    os.Exit(1)
}

Then we convert it to objects using json.Unmarshal and print them just to show it works

// json.Unmarshal needs "encoding/json"; checking its error catches malformed JSON
if err := json.Unmarshal(file, &sensor_script); err != nil {
    fmt.Printf("JSON error %v\n", err)
    os.Exit(1)
}
for _, v := range *sensor_script {
    fmt.Println(v.Source)
    fmt.Println(v.What)
    fmt.Println(v.Condition)
    fmt.Println(v.Value)
    fmt.Println(v.Host)
    fmt.Println(v.Action)
    fmt.Println("=========")
}
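
For reference, here is the whole thing put together in one small program, using what is probably the more common idiom: declaring the slice value directly and passing its address to Unmarshal, rather than declaring a pointer as above. Just an alternative sketch - the pointer version works too:

package main

import (
    "encoding/json"
    "fmt"
    "io/ioutil"
    "os"
)

type command_record struct {
    Source    string
    What      string
    Condition string
    Value     string
    Host      string
    Action    string
}

func main() {
    file, err := ioutil.ReadFile("./sensor.json")
    if err != nil {
        fmt.Printf("File error %v\n", err)
        os.Exit(1)
    }

    // declare the slice value itself; Unmarshal grows it to fit the file
    var script []command_record
    if err := json.Unmarshal(file, &script); err != nil {
        fmt.Printf("JSON error %v\n", err)
        os.Exit(1)
    }

    for _, v := range script {
        fmt.Println(v.Source, v.What, v.Condition, v.Value, v.Host, v.Action)
    }
}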

And, after all that time - it is that simple. And I am sure that many, many people have figured it out within ten minutes, but others have to google it and waste a lot of time. This is for them.

The code and a sample JSON file can be found on GitHub at https://github.com/DaveAppleton/LoadObjectSliceFromJson