Copyright Virgilio Lino 2011 Geozonedb. virgiliolino@gmail.com
Based on data taken from geonames.org
The framework and SQL files are released under a dual license:
the Creative Commons Attribution 3.0 License and the GPL version 2.
GeoZoneDB is an open-source project that loads geozone data into SQL
files, ready to download if you don't want to waste time. The database
has been designed to retrieve data as fast as possible for
every community, state, and country in the world, with population, postal
codes, etc.
GeoZoneDB 0.5
geozonedb.php - the world data extraction framework
geozonesearch.php - the only class you need if you plan to integrate geodata into your software or site.
An autosuggest example is in index.php; please read the comments at the beginning of that file.
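As a rough illustration, this is how an autosuggest endpoint like the one in index.php might look. This is only a minimal sketch: the class and method names (geoZoneSearch, searchZones) are hypothetical and the real API in geozonesearch.php may differ, so check the comments in index.php for the actual usage.

<?php
// Hypothetical autosuggest endpoint sketch; real names live in geozonesearch.php / index.php.
require_once 'geozonesearch.php';

$query = isset($_GET['q']) ? trim($_GET['q']) : '';
if ($query !== '') {
    $search  = new geoZoneSearch();           // hypothetical class name
    $results = $search->searchZones($query);  // hypothetical method returning matching zones
    header('Content-Type: application/json');
    echo json_encode($results);               // the browser-side script renders the suggestions
}
?>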
I'm building a new dump in English; you'll just need to download it and include a few files
to have all the data in your software. I highly recommend creating a separate database just for the geographical
data, as keeping it together with your own database tables could slow everything down.
Read the instructions for GeoZoneDB 0.4 below; they also apply to 0.5.
Please let me know at virgiliolino@gmail.com if you like it, send me suggestions, or tell me if you want TO HELP ME!!!!
I'm looking for every kind of skill. Even if you are not a programmer, I need someone to create a dump for every world language,
someone to put files on my site, etc.
GeoZoneDB 0.4
What is GeoZoneDB
--------------------------------------------------------------------------------------------------------------------
You may just download (it's still not updated to version 0.4):
geozonedb_dump_data.tar.bz2, which is an SQL file. If you import it into your database you'll have
an optimized database with all the world zones, 2 country tables, and more than one hundred
zone community tables.
To create a database with the new 0.4 version, do the following:
0. Create a database, for example with phpMyAdmin, or reuse an existing database.
To set up:
1. Edit includes/configure.php (a sketch follows below):
DB_SERVER : the server name; usually it's fine to leave "localhost"
DB_SERVER_USERNAME : the username used to access the db
DB_SERVER_PASSWORD : the password of the user used to access the db
DB_DATABASE : the name of the database
There is no need to create the tables in the database; they will be generated automatically.
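A minimal sketch of what includes/configure.php might contain, assuming the four constants above are set with define() (the file shipped with the framework may use a different mechanism; the values here are placeholders for your own setup):

<?php
// includes/configure.php - sketch only; replace the values with your own settings
define('DB_SERVER', 'localhost');          // the server name
define('DB_SERVER_USERNAME', 'geouser');   // the username to access the db
define('DB_SERVER_PASSWORD', 'secret');    // the password of that user
define('DB_DATABASE', 'geozonedb');        // the name of the database
?>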
usage:
to populate the countries data:
require_once 'geozonedb.php'; // the extraction framework class
$geo = new geoZoneDB();
$geo->setLocale('IT'); // if set, data will be loaded in the indicated language instead of English
$geo->loadCountries();
to populate the zones data:
$geo = new geoZoneDB();
$geo->loadZones();
************************************************************************
HELP WANTED
I need help to optimize the database; I plan to integrate the framework
with Sphinx indexes.
I need help to improve the PHP framework that imports the data.
I need help to build a multilevel set of combos to navigate through
country-state-community.
For the next release I plan to build an autosuggest system instead of the
multilevel set of combos.
Please contact me at virgiliolino@gmail.com
---------------------------------------------------------------------------------
Changes:
Version 0.4 ----------------------------------------------------------------------------------------------------
I've done some OO evolution of the code. In this version I'm going to try MySQL's full-text search capabilities;
even though benchmarks around the net say MySQL full-text search is slow, I've run some tests and I think it's fine:
I get the same benchmark as with InnoDB, but without the advanced IR capabilities applied to the index.
- results ordered by ranking!
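To illustrate the kind of ranked query this enables, here is a minimal sketch assuming a MyISAM zone table named IT with a FULLTEXT index on a hypothetical name column (the real table and column names in the dump may differ):

<?php
// Hypothetical sketch: rank-ordered full-text search on one zone table.
// Table 'IT' and column 'name' are assumptions; adapt them to the real schema.
require_once 'includes/configure.php';

$db   = new mysqli(DB_SERVER, DB_SERVER_USERNAME, DB_SERVER_PASSWORD, DB_DATABASE);
$stmt = $db->prepare(
    "SELECT Zone_ID, name, MATCH(name) AGAINST(?) AS relevance
     FROM IT
     WHERE MATCH(name) AGAINST(?)
     ORDER BY relevance DESC
     LIMIT 10"
);
$term = 'Roma';
$stmt->bind_param('ss', $term, $term);
$stmt->execute();
// ... fetch the result set and display the matching zones, best matches first
?>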
Version 0.3 ----------------------------------------------------------------------------------------------------
I've changed the structure of every zones table: I've removed the ISOnumeric field, the field that identified
the country. Since the data is inserted into the table named after the country code, I think it is enough
to take the table name; however, I've created a ZoneToCountry table with two fields, which will be useful to identify
the country a zone comes from.
The second most important change is that Zone_ID is now a unique ID across all the zone tables, so the extraction framework
must assign a unique ID to every zone row. I mean unique not just inside the table where it lives, but unique across every
zone table; this way we have a 32-bit key, the first requirement for applying Sphinx's indexing capabilities. We plan to build
a view over all the zone tables and to index it with the open-source indexing server.
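A minimal sketch of how such a view over the per-country zone tables could be built, assuming each zone table is named after its country code and carries Zone_ID plus a hypothetical name column (the view name AllZones and the column list are assumptions, not the project's final schema):

<?php
// Hypothetical sketch: build a UNION ALL view over the per-country zone tables
// so that Sphinx (or any indexer) can read all zones from a single source.
require_once 'includes/configure.php';

$db            = new mysqli(DB_SERVER, DB_SERVER_USERNAME, DB_SERVER_PASSWORD, DB_DATABASE);
$countryTables = array('IT', 'FR', 'DE'); // in practice, the full list of country-code tables

$selects = array();
foreach ($countryTables as $table) {
    $selects[] = "SELECT Zone_ID, name, '" . $db->real_escape_string($table) . "' AS country_code FROM `$table`";
}
$db->query('CREATE OR REPLACE VIEW AllZones AS ' . implode(' UNION ALL ', $selects));
?>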