## Create a search…then speed it up by a factor of 50!

One great way to implement a basic search is to put all your data fields into one big keywords field and then perform the search on that one field. Let's say you've got a Customer model. Open up customer.rb (in app/models/) and add these few lines of code:

```ruby
class Customer < ActiveRecord::Base
  scope :search, ->(search) { where('keywords LIKE ?', "%#{search.downcase}%") }

  before_save :set_keywords

  protected

  def set_keywords
    # join all searchable fields into one lowercased string
    # (add any further fields you want to be searchable)
    self.keywords = [name, email, birthdate].map { |field| field.to_s.downcase }.join(' ')
  end
end
```

To pick out only the matching records, we create a scope that takes the search argument. The set_keywords callback turns all the separate fields into one big keywords string before every save.

Now, open up your controller (customers_controller.rb):

```ruby
class CustomersController < ApplicationController
  def index
    @customers = Customer.search(params[:search])
  end
end
```

If desired, you can add some validation for the case where no search parameter is provided.

This approach won't yield Amazon-like results, but it can take you pretty far.
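If you want to see the mechanics without a database, the whole idea can be sketched in plain Ruby (the field values here are made up for illustration):

```ruby
# Plain-Ruby sketch of the keywords approach: each record carries a
# pre-built, lowercased keywords string, and search is a substring match.
customers = [
  { name: "Bob Smith", email: "bob@example.com", keywords: "bob smith bob@example.com" },
  { name: "Alice Doe", email: "alice@foo.org",   keywords: "alice doe alice@foo.org" },
]

def search(customers, term)
  customers.select { |c| c[:keywords].include?(term.downcase) }
end

results = search(customers, "Bob")
# results contains only the Bob Smith record
```

This is exactly what the scope above does, except that the database performs the substring match via LIKE.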

## Step up the game

If you want to go one small step further but don't want to bother with "heavy industry" tools like Elasticsearch, you can utilize PostgreSQL's awesome built-in indexing features. This approach requires a bit more setup than the first one, but once you follow these steps, it turns out to be quite straightforward.

The main goal is to write our custom SQL query. In the end what we want to achieve is something like this:

```sql
SELECT
  *
FROM
  customers
WHERE
  lower(first_name) LIKE 'bob%' OR
  lower(last_name)  LIKE 'bob%' OR
  lower(email)      =    'bob@example.com'
ORDER BY
  email = 'bob@example.com' DESC,
  last_name ASC
```
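The ORDER BY line deserves a word: `email = 'bob@example.com'` evaluates to a boolean, and sorting it DESC floats exact email matches to the top. In plain Ruby terms the same trick looks like this (the sample data is made up):

```ruby
# Emulate ORDER BY (email = 'bob@example.com') DESC, last_name ASC:
customers = [
  { email: "bob.smith@foo.com", last_name: "Smith" },
  { email: "bob@example.com",   last_name: "Zeta"  },
  { email: "bobby@bar.net",     last_name: "Adams" },
]

sorted = customers.sort_by do |c|
  # exact matches get sort key 0 so they come first; ties break on last_name
  [c[:email] == "bob@example.com" ? 0 : 1, c[:last_name]]
end

sorted.first[:email]  # => "bob@example.com"
```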

Using Rails syntax, this means we need a query like this:

```ruby
Customer.where("lower(first_name) LIKE :first_name OR " +
               "lower(last_name) LIKE :last_name OR " +
               "lower(email) = :email", {
                 first_name: "bob%",
                 last_name:  "bob%",
                 email: "bob@example.com"
               }).order("email = 'bob@example.com' desc, last_name asc")
```

To build this query while keeping things clean, I'm using an extra class (models/customer_search_term.rb):

```ruby
class CustomerSearchTerm
  attr_reader :where_clause, :where_args, :order

  def initialize(search_term)
    search_term = search_term.downcase
    @where_clause = ""
    @where_args = {}
    if search_term =~ /@/
      build_for_email_search(search_term)
    else
      build_for_name_search(search_term)
    end
  end

  # The following code goes here
end
```

Now we need the helper method build_for_name_search (I'll omit build_for_email_search for now):

```ruby
def build_for_name_search(search_term)
  @where_clause << case_insensitive_search(:first_name)
  @where_args[:first_name] = starts_with(search_term)

  @where_clause << " OR #{case_insensitive_search(:last_name)}"
  @where_args[:last_name] = starts_with(search_term)

  @order = "last_name asc"
end

def starts_with(search_term)
  search_term + "%"
end

def case_insensitive_search(field_name)
  "lower(#{field_name}) like :#{field_name}"
end
```

These methods do nothing more than build the strings for our SQL query. You can verify this by examining the variables: @where_clause is a string, while @where_args is a hash.
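To see both branches side by side, here is a self-contained sketch of the class, including a guessed build_for_email_search (the email branch does an exact match; the exact shape of the omitted method may differ from what's shown here):

```ruby
class CustomerSearchTerm
  attr_reader :where_clause, :where_args, :order

  def initialize(search_term)
    search_term = search_term.downcase
    if search_term =~ /@/
      build_for_email_search(search_term)
    else
      build_for_name_search(search_term)
    end
  end

  protected

  # Hypothetical email branch: exact match on the lowercased email column.
  def build_for_email_search(search_term)
    @where_clause = "lower(email) = :email"
    @where_args = { email: search_term }
    @order = "last_name asc"
  end

  def build_for_name_search(search_term)
    @where_clause = "lower(first_name) LIKE :first_name OR lower(last_name) LIKE :last_name"
    @where_args = { first_name: "#{search_term}%", last_name: "#{search_term}%" }
    @order = "last_name asc"
  end
end

term = CustomerSearchTerm.new("Bob@Example.com")
term.where_clause  # => "lower(email) = :email"
```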

Finally, let’s build our controller:

```ruby
class CustomersController < ApplicationController
  def index
    if params[:keywords].present?
      @keywords = params[:keywords]
      customer_search_term = CustomerSearchTerm.new(@keywords)
      @customers = Customer.where(
        customer_search_term.where_clause,
        customer_search_term.where_args).
        order(customer_search_term.order)
    else
      @customers = []
    end
  end
end
```


We're almost there. This search works, but it's still a bit slow (see the benchmark below).

Now we need to create custom indexes on those columns. We'll stick to the old-style up and down migration methods, since Rails can't automatically reverse our custom SQL if we use change.

```ruby
class AddLowerIndexesToCustomers < ActiveRecord::Migration
  def up
    execute %{
      CREATE INDEX
        customers_lower_last_name
      ON
        customers (lower(last_name) varchar_pattern_ops)
    }

    execute %{
      CREATE INDEX
        customers_lower_first_name
      ON
        customers (lower(first_name) varchar_pattern_ops)
    }

    execute %{
      CREATE INDEX
        customers_lower_email
      ON
        customers (lower(email))
    }
  end

  def down
    remove_index :customers, name: 'customers_lower_last_name'
    remove_index :customers, name: 'customers_lower_first_name'
    remove_index :customers, name: 'customers_lower_email'
  end
end
```

It's important to use varchar_pattern_ops here, since on the name columns we're not doing an exact match but using the LIKE operator. Now, if we fire up rails dbconsole and perform EXPLAIN ANALYZE on both variants, we can see the difference.

This was the original result:

```
Started GET "/customers?utf8=%E2%9C%93&keywords=andres148127%40dicki.biz&commit=Find+Customers" for ::1 at 2017-01-04 22:19:21 +0300
Processing by CustomersController#index as HTML
  Parameters: {"utf8"=>"✓", "keywords"=>"andres148127@dicki.biz", "commit"=>"Find Customers"}
  User Load (0.3ms)  SELECT  "users".* FROM "users" WHERE "users"."id" = $1 ORDER BY "users"."id" ASC LIMIT 1  [["id", 1]]
  Customer Load (561.3ms)  SELECT "customers".* FROM "customers" WHERE (lower(first_name) like 'andres%' OR lower(last_name) like 'andres%' OR lower(email) like 'andres148127@dicki.biz') ORDER BY lower(email) = 'andres148127@dicki.biz' desc, last_name asc
  Rendered customers/index.html.haml within layouts/application (572.4ms)
Completed 200 OK in 591ms (Views: 27.8ms | ActiveRecord: 561.6ms)
```

After spicing things up:

```
Started GET "/customers?utf8=%E2%9C%93&keywords=andres148127%40dicki.biz&commit=Find+Customers" for ::1 at 2017-01-04 22:40:59 +0300
Processing by CustomersController#index as HTML
  Parameters: {"utf8"=>"✓", "keywords"=>"andres148127@dicki.biz", "commit"=>"Find Customers"}
  User Load (0.2ms)  SELECT "users".* FROM "users" WHERE "users"."id" = $1  ORDER BY "users"."id" ASC LIMIT 1  [["id", 1]]
  Customer Load (2.1ms)  SELECT "customers".* FROM "customers" WHERE (lower(first_name) like 'andres%' OR lower(last_name) like 'andres%' OR lower(email) like 'andres148127@dicki.biz')  ORDER BY lower(email) = 'andres148127@dicki.biz' desc, last_name asc
  Rendered customers/index.html.haml within layouts/application (12.7ms)
Completed 200 OK in 32ms (Views: 28.3ms | ActiveRecord: 2.3ms)
```


That's the query time dropping from 561.6ms to 2.3ms, a speedup by a factor of roughly 243!

## Eve-trader.net – the new source for EVE Online productioneers

Greetings, capsuleers!

A few days ago I launched my long-dreamed-of project, eve-trader.net (http://eve-trader.net). This service is intended as a lightweight alternative to updating your Excel sheets daily with current prices (most likely fetched from eve-central.com). Via the eve-central.com API I fetch the prices for the requested blueprints, calculate the cheapest material prices for you, and suggest where you can sell the end product at the highest price.

Settings and assumptions as of today:

• Areas considered for the calculation are all high-sec solar systems within 5 jumps of the five main trade hubs (currently Jita, Amarr, Rens, Dodixie, Hek)
• All skills are level 5 (which you probably should have if you're considering making money with industry)
• No fees or taxes are calculated (if they prevent you from making a profit, you're producing the wrong thing)

The service is in its early stage, so be prepared to encounter some changes in the near future. What is planned:

• At least a few options available in an "expert" menu
• A news section
• User accounts where users can save their blueprints, giving them faster access to their favourites
• A saved history for each blueprint
• A suggestions feature, where users can suggest new things and other users can vote on them
• Market predictions based on Monte Carlo simulation or expected-shortfall methods

## Properly create and restore your development PostgreSQL database on production

So maybe you're like me: you parse a lot of static data into a development database, build an app around this data, and then you don't want an empty database on production; instead, you want a full copy of your development database.

After spending a few hours trying to make this work, here is the one solution that yielded exactly zero errors (yay!).

On your local machine (in the shell, wherever you currently are):

```
pg_dump --no-owner your_db_development > backup.bak
```

Then put it in some safe place on your remote machine:

```
scp backup.bak deploy@192.168.111.222:/home/restore/backup.bak
```

Make a backup of your production database for safety's sake (on your production server, of course):

```
pg_dump --no-owner your_db_production > /home/backups/04-03-2016/backup.bak
```

If there is no production database on the server yet, proceed with creating it; otherwise you might need to cut the current sessions (if the app has users), so check these out:

http://stackoverflow.com/questions/12924466/capistrano-with-postgresql-error-database-is-being-accessed-by-other-users

or

http://stackoverflow.com/questions/1237725/copying-postgresql-database-to-another-server?rq=1

OK, now we have no database on the production machine at all. Let's create a new production database:

```
postgres=# CREATE DATABASE database_production;
CREATE DATABASE

postgres=# CREATE USER production_db_user WITH password 'qwerty';
CREATE ROLE

postgres=# GRANT ALL privileges ON DATABASE database_production TO production_db_user;
GRANT
```

Make sure to use the same password as set up in database.yml for the production user.

That’s it! No need to change the owner. Now let’s restore the backup file:

```
sudo psql -d database_production -U production_db_user < /home/restore/backup.bak
```

```
SET
SET
SET
SET
SET
SET
CREATE EXTENSION
ERROR:  must be owner of extension plpgsql
SET
SET
SET
CREATE TABLE
CREATE SEQUENCE
ALTER SEQUENCE
CREATE TABLE
CREATE SEQUENCE
ALTER SEQUENCE
CREATE TABLE
CREATE SEQUENCE
ALTER SEQUENCE
CREATE TABLE
ALTER TABLE
ALTER TABLE
ALTER TABLE
COPY 14964
 setval
--------
 114041
(1 row)

COPY 3151
 setval
--------
   3236
(1 row)

COPY 28011
 setval
--------
   7081
(1 row)

COPY 8
ALTER TABLE
ALTER TABLE
ALTER TABLE
CREATE INDEX
CREATE INDEX
CREATE INDEX
ALTER TABLE
ALTER TABLE
```

So far, the single error above (must be owner of extension plpgsql) has caused me no issues.

## Sneak peek – Team Geek

When communicating with other people (not only your colleagues), remember the golden rules I picked up reading this fairly amazing book (a complete review will follow later):

• Your explanations should take at most three paragraphs
• You should end with a call to action

It is amazing how often these simple rules are ignored in everyday written communication.

## Rails eShop – my first big project using Ruby on Rails

I finished my very first "serious" project – yay! It looks and behaves like a Rails eShop (you'll find the link further below). Before I go into the technical details, some intro: this website was created for friends of mine, and although it was kind of experimental for me (my first real app), it turned out quite decent, as one might say.

As with every customer-oriented project, there was a lot of JavaScript involved, which is by far not my strongest language. It started calm and nice with pure Ruby on Rails, and I could use a lot of what I learned from the book Agile Web Development With Rails. But then, as more and more wishes appeared, I realized there was no way around some front-end development.

Enough smalltalk, let’s get straight to the data:

• Bootstrap theme used: "Minimal"
• Notable gems: figaro (managing global variables), devise (users), carrierwave (image uploads), sendgrid (for sending all kinds of mail)
• Database: PostgreSQL (I think it's pretty much a must-have for shops because of hstore, which, as far as I understand, makes Postgres almost non-relational: hstore columns let you store key/value pairs)
• Deployment: capistrano, puma, nginx (you can check out my article about deployment here)
• Other features: protected admin namespace

JavaScript: a calculator for the end price of a product based on amount, delivery method and size (plus additional services, which are differentiated between per-product and per-order ones). There is some hard-coded stuff, and of course this piece of code might not be state of the art; however, it works pretty well.

```coffeescript
$(document).ready ->
  if $('a.size_input').length != 0
    $('a.size_input')[0].click()

jQuery ->
  $('li a').click (event) ->
    # event.preventDefault()
    $('a').removeClass('active')
    $(this).addClass('active')

  $('form').on 'click', '.add_fields', (event) ->
    time = new Date().getTime()
    regexp = new RegExp($(this).data('id'), 'g')
    $(this).before($(this).data('fields').replace(regexp, time))
    event.preventDefault()

  get_multiplier = ->
    amount = parseInt($('#amount').val(), 10)
    multiplier = switch
      when amount <= 99 then 1.5
      when amount >= 100 and amount < 300 then 1.4
      when amount >= 300 and amount < 500 then 1.3
      when amount >= 500 and amount < 1000 then 1.25
      when amount >= 1000 and amount < 3000 then 1.23
      when amount >= 3000 and amount < 7000 then 1.2
      else 1.2 # fallback for amounts of 7000 and above
    return multiplier

  get_delivery = ->
    delivery = $('#delivery').val()
    return parseFloat(delivery).toFixed(2)

  get_size = ->
    # in this method we either select the price value based on the flash
    # drive size, or, if it has a basic price, simply the basic price
    # from the hidden #basicprice id
    selected_size = $('input[name=volume]:checked').val()
    if selected_size == undefined
      return parseFloat($('#basicprice')[0].innerText)
    else
      array = selected_size.split(",")[1].replace(']', '')
      return parseFloat(array).toFixed(2)

  get_addservices = ->
    # This method sums the prices of all checked addservice checkboxes,
    # split into per-product services (sum) and whole-order services (sumParty).
    # (Reconstructed part: assumes each checkbox stores its price in `value`
    # and the 'apply to whole party' flag in `data-party`.)
    sum = 0
    sumParty = 0
    for checkbox in $('.addservice_checkbox')
      if checkbox.checked
        # Here we need to check if 'apply to whole party' is true or false
        price = parseFloat($(checkbox).val())
        if $(checkbox).data('party')
          sumParty += price
        else
          sum += price
    return [parseFloat(sum), parseFloat(sumParty)]

  recalculate = ->
    # This function recalculates the price
    base = get_size()
    deliveryCoefficient = get_delivery()
    amountCoefficient = get_multiplier()
    [addserviceOne, addserviceParty] = get_addservices()
    amount = parseInt($('#amount').val(), 10)
    console.log('base - ' + base + ' | delivery - ' + deliveryCoefficient +
      ' | add service sum - ' + addserviceOne +
      ' | amount coefficient - ' + amountCoefficient)
    endPriceOne = (base * deliveryCoefficient * amountCoefficient) + addserviceOne
    endPriceParty = (base * deliveryCoefficient * amountCoefficient) * amount
    endPriceAll = endPriceOne * amount + addserviceParty
    endPriceServices = (addserviceOne * amount) + addserviceParty
    $('#priceValue').text(endPriceAll.toFixed(2))
    $('#priceForOne').text(endPriceOne.toFixed(2))
    $('#priceForAll').text(endPriceParty.toFixed(2))
    $('#priceForAddservices').text(endPriceServices.toFixed(2))

  $('.size_input').click (event) ->
    recalculate()

  $('.addservice_checkbox').change (event) ->
    recalculate()

  $('#amount').on 'input', (event) ->
    recalculate()

  $('#delivery').change (event) ->
    recalculate()

  $('#thediv').click (event) ->
    $('#addservices').toggleClass('reveal-closed').toggleClass('reveal-open')

  $('#load_more_btn').click (event) ->
    event.preventDefault()
```

Here we have some fancy helper methods I grabbed from Ryan Bates' screencasts. They allow you to add fields on the go:

```ruby
module ApplicationHelper

  def hidden_div_if(condition, attributes = {}, &block)
    if condition
      attributes["style"] = "display: none"
    end
    content_tag('div', attributes, &block)
  end

  # Renders a link that adds a new set of nested-model fields on the fly
  # (from Ryan Bates' screencast on nested forms)
  def link_to_add_fields(name, f, association)
    new_object = f.object.send(association).klass.new
    id = new_object.object_id
    fields = f.fields_for(association, new_object, child_index: id) do |builder|
      render(association.to_s.singularize + "_fields", f: builder)
    end
    link_to(name, '#', class: "add_fields", data: { id: id, fields: fields.gsub("\n", "") })
  end
end
```

We have, of course, a fully functional Rails eShop with a working cart, order placement, confirmation emails to customers and admins, an administration panel for managing products, and so on.

Unfortunately, the owners decided to put the page on ice, so it's unlikely that you will receive any flash drives. However, you can check out the page yourself – http://souvenirnya-produktsiya.ru/

## The Software Development Edge – Book Review – Part III/III

Finally I gathered all the strength required to finish this review. It's a bit tougher than back in the school days; however, I hope it serves some educational purpose.

As explained in the last post, this part of the review will not continue down the previous path, but instead summarize the whole book (in a shorter way). You can regard it as a kind of "best of".

## Core idea – Politics

[readolog_blockquote ]"Developing software is such a unique and complicated process that I can't estimate how long it will take, and I won't take responsibility for my due dates" – this is utter bullshit. Software development is just as unique as any other professional field (just ask your friends). If someone refuses to take responsibility, it is clearly not because he or she is such a genius.[/readolog_blockquote]

## Core idea – Crisis

[readolog_blockquote ]Act – that's the main point. Regardless of whether you're a hired crisis manager or were just moved to another project that is deep in the shit, your main goal now is acting. It will stink for a while, so don't get hung up on keeping good relationships (with customers, other employees, anyone...).[/readolog_blockquote]

## Core idea – Project

[readolog_blockquote ]Keep your project within the given due date. Do it at any cost – mostly you will be cutting off unnecessary features. However, you can't fit every project into every time frame. Take your time in advance and get honest opinions from your team members on how they estimate the whole project. You really need those honest opinions and people you can trust. If you can't trust them, it's time to say goodbye.[/readolog_blockquote]

## Core idea – Developing

[readolog_blockquote ]It might be the de facto standard by now, but I still see, even in my own team, how quickly things get out of control if you don't invest time into concentrated continuous releases, called iterations. Get everyone to deliver something each week. It will require a lot of endurance from you, but there is just no other way...[/readolog_blockquote]

I promised to keep it short and simple, and here we are: four core points that should awaken your interest in this book. If you want to go deeper, I recommend parts I and II of the review; but more than that, I suggest you go out there and buy this amazing book.

## How to write useful technical specs

I'm probably not wrong in assuming that a lot of IT people out there encounter the same problem with tech specs all the time: they become obsolete sooner than they are written. Mostly they are created to show the client: hey, we're doing stuff here, here is the result, 100 pages of pure informational power. The result, however, is often the same: after one glance, each team member soon forgets the spec ever existed.

So how do you make specs more useful? In this article I won't show how to formulate your spec, but rather how (technically) to ensure that your spec stays up to date.

## Use LaTeX

No, not in the bedroom with your girlfriend (then again, who am I to judge?), but the document markup language (according to Wikipedia).

LaTeX is not only cool – writing specs feels almost like writing code – but it also allows you to maintain a healthy development process around your spec.

### Problem No. 1

When your spec starts getting bigger and you add stuff here and there, it is very (VERY!) hard for your team to follow up on changes. Using Git in combination with LaTeX allows you to commit any change just as if it were code. And Git diffs fantastically highlight all the changes that were made. So next time you can just give your team a link to the latest commit and they will see only the new and recent parts.

### Problem No. 2

Have you ever tried to create links within a document, say in MS Word? Yeah, you're kind of able to do that, but if you add, say, a new point to your numbered list, all your links get messed up. Needless to say what happens if you add a whole new chapter at the beginning. LaTeX lets you define labels, just like variables in any programming language. Just use a simple

```latex
\label{section:intro}
```

where section:intro is just a clear name indicating that your content is in the section called "intro".
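Combined with \ref, such a label gives you cross-references that survive any amount of reshuffling (the section name here is illustrative):

```latex
\section{Introduction}\label{section:intro}
% ... later, anywhere in the document:
As described in Section~\ref{section:intro}, the spec is versioned like code.
```

Renumber, reorder, or move chapters as you like: the reference always resolves to the right section number.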

### Problem No. 3

As soon as you try to maintain your MS Word spec, you realize that the program is clearly not made for writing large documents. Ever tried to create a numbered list, add a break, and then continue the list? Then you know how easy it is to screw things up. Because a LaTeX document is completely under your control, there is almost always an easy solution for whatever you want to achieve.

## Use Git

I mentioned it already in the previous paragraph, but I can't stress it enough: your LaTeX spec will really shine bundled with Git. Not only do you get neat version control (no more saving "project_x_tech_spec.v.1.0.docx", "project_x_tech_spec.v.1.8.docx" and so on) and the ability to share only your recent changes by sending a commit link, but you also get one place for your always up-to-date end format, because GitHub recently learned to display PDF files.

So basically your PDF URL always stays the same.

## Where to host code?

Trust me, even if you have just one developer, you'd better start using version control. GitHub is the most common service for this, but if you can't afford to share your code with the community, you have to pay for private repositories. A widely known alternative is Gitlab, which not only lets you host the service on your own machine, but also doesn't charge you for private repositories if you use the hosted version. I didn't manage to install Gitlab on our $5 droplet (maybe I'm just not gifted... maybe it has something to do with the recommended RAM size of 2 GB and the two unicorn workers needed to handle HTTP requests – a $5 droplet has only 512 MB of RAM), so we're using private hosted repos. However, if you manage to install Gitlab on your own machine, the following routine becomes much easier, since pulling from Gitlab won't require any hooks, as it will live on the same machine as Redmine itself.

## How to connect them?

You probably don't want to maintain two systems at a time. This is where so-called hooks come into play. You can try to install https://github.com/phlegx/redmine_gitlab_hook but for me, although I set up everything correctly (at least I suppose so), it didn't work. What I did instead was the following:

• Create an SSH key on the server where Redmine is hosted. Since chances are you won't be able to copy the key to your clipboard there, send the file to your local machine first via scp, then copy it:

```
scp root@192.168.0.1:/.ssh/id_rsa.pub /directory_created_for_the_key
pbcopy < /directory_created_for_the_key/id_rsa.pub
```

• Add that key to your Gitlab account, then wire the repository up to Gitlab:

```
git remote add origin git@gitlab.com:your_project/your_repo.git
git push -u origin master
```

• Finally, set up a cron job on the Redmine server that regularly pulls the latest changes:

```
*/5 * * * * app cd /path/to/project.git && git fetch origin && git reset --soft refs/remotes/origin/master > /dev/null
```