Everything posted by Valtasar

  1. Hi! On our board we were using Member Map 1.0.7 and had created (and still wish to use) some custom markers. A few weeks ago the main page started getting stuck when I load Member Map, and as I understand it this is related to the application not yet supporting the Google Maps API v3. (@stoo2000: Is this true? I see that part of the JavaScript code has been converted to the v3 API. Can you please explain what forced you to move to the Bing Maps API? Sorry for asking - I am sure you have been asked this before!) I then upgraded to 1.0.9 (after seeing that the 2.0.0 beta does not support custom markers yet), but I still get the same "stuck" page. So, as others ask above, will custom markers be supported soon, or did you run into problems implementing them with the Bing Maps API, so that this will be delayed? Keep up the good work!
  2. Many of our board users had problems viewing large images in IPB3, since the version of Lightbox used does not allow resizing them. I tried this alternative and it works great (although maybe a bit slower), with no code changes needed (except removing the options header) and only two extra options to be added in the "include_lightbox_real" skin template. I think it's a nice option to consider for anyone facing the same problem, and maybe Invision Power could try to support this in future IPB versions.
  3. Yesterday, after many unsuccessful attempts (reconnecting after the MySQL server connection dropped did not solve the problem), I managed to run the script successfully by simply raising the MySQL server variable "max_allowed_packet" from 1M to 24M! After this, I even set $limit=100 (100 update queries per cycle) and the script ran without problems. I guess another way to do it is by running the query mysql_query("SET max_allowed_packet=24M"); at the start. @media: it is not clear from your output whether you had the same problem ("MySQL server has gone away") or some other timeout. However, as bfarber suggested above, the best approach is to run the script at home from the command line, as it needed about 6 hours in my case, for a database with an SQL dump size of about 250MB.
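     In case it helps anyone, here is roughly what the max_allowed_packet change and the reconnect guard look like in code. This is only a sketch with illustrative connection details, using the old mysql_* extension that the IPB2-era scripts use; note that when the variable is set through SQL it must be a numeric byte count (not "24M"), and raising the GLOBAL value normally needs the SUPER privilege, otherwise change my.cnf instead.

        <?php
        // Raise max_allowed_packet for this conversion run (example credentials).
        $link = mysql_connect('localhost', 'dbuser', 'dbpass') or die(mysql_error());
        mysql_select_db('bak_forum', $link);

        // 24M expressed in bytes; SET GLOBAL needs SUPER, otherwise edit my.cnf.
        @mysql_query('SET GLOBAL max_allowed_packet = ' . (24 * 1024 * 1024), $link);

        // Simple "server has gone away" guard: ping and reconnect before each batch.
        function ensure_connection(&$link) {
            if (!mysql_ping($link)) {
                mysql_close($link);
                $link = mysql_connect('localhost', 'dbuser', 'dbpass');
                mysql_select_db('bak_forum', $link);
            }
        }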
  4. Hi, I have converted the script to support batch queries, and uploaded it as a Resource (and here). I still get some "MySQL server has gone away" errors, which I will probably have to fix by retrying the query after reconnecting, but otherwise the conversion seems to be quite successful (of course extensive testing of the new IPB setup is now needed before actually using this). I also had problems with the Greek non-utf8 encodings, since "windows-1253" and "iso-8859-7" have specific differences but seem to co-exist in the database. Actually the Euro sign (A4) and a few more should probably be added to iso-8859-7 in the IPB ConvertTable (I added them in my tests). A serious problem still exists with character 0xA2 (accented A symbol), which has a different meaning in the two encodings that seem to co-exist, and probably has to be solved manually :( Also, I had to ignore some fields (e.g. ibf_gallery_images.metadata, which holds EXIF headers, and ibf_gallery_images.file_name, which has characters in an encoding I cannot recognize, but is probably not important). One other field already contained utf8 symbols, so I had to write special code in order NOT to re-convert them! The problems just seem to be never-ending, and I guess more will appear once the database is actually used by IPB, so good luck to anyone who will be using this! ipb2_utf8_convert_v1.0.php
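     For anyone curious, the two fixes mentioned above (skipping content that is already utf8, and retrying after a dropped connection) boil down to something like the sketch below. The names and connection details are only illustrative and it assumes the old mysql_* extension; it is not the exact code from the attached script.

        <?php
        // Skip values that already look like valid utf8 so they are not converted twice
        // (plain ASCII also passes this check, which is harmless to skip).
        function needs_conversion($value) {
            return !mb_check_encoding($value, 'UTF-8');
        }

        // Run a query, reconnecting once if MySQL reports "server has gone away" (errno 2006).
        function query_with_retry($sql, &$link) {
            $result = mysql_query($sql, $link);
            if ($result === false && mysql_errno($link) == 2006) {
                mysql_close($link);
                $link = mysql_connect('localhost', 'dbuser', 'dbpass');
                mysql_select_db('bak_forum', $link);
                $result = mysql_query($sql, $link);
            }
            return $result;
        }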
  5. First, it is strange that important IPB tables (e.g. ibf_topics, ibf_posts and others) that definitely contain Turkish characters do not appear in the output. I have no idea why the script does not even seem to try to convert them. Here is a suggestion to see what goes wrong:
     1) Uncomment line 87 and provide there a table that definitely contains Turkish characters. The script will now try to convert THIS table ONLY.
     2) Then go to line 132 of the script (printf("in: %s\n",$row[$num_pk]); ) and uncomment it.
     3) Redirect the output to a file, as you already did.
     If your database, the table you selected, the connection and the content are all fine, you should now see the correct contents of that specific table, with real iso-8859-9 characters, in this file. Good luck. PS: You probably HAVE to run the PHP script from the command line, or raise the PHP setting "max_execution_time", because this script really needs a lot of time to run.
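     If it helps, a standalone version of that debugging step looks more or less like this. The connection details, table name, primary key and text column are just examples; the idea is simply to dump one table's rows and check that the expected Turkish characters really come back from the database.

        <?php
        // Debug sketch: print the primary key and a text column of every row of ONE table,
        // so the redirected output shows whether real iso-8859-9 characters are being read.
        set_time_limit(0);                                   // avoid max_execution_time aborts
        $link = mysql_connect('localhost', 'dbuser', 'dbpass') or die(mysql_error());
        mysql_select_db('bak_forum', $link);

        $table = 'ibf_posts';                                // a table known to hold Turkish text
        $pk    = 'pid';                                      // its primary key column
        $field = 'post';                                     // a text column with Turkish content
        $result = mysql_query("SELECT `$pk`, `$field` FROM `$table`", $link) or die(mysql_error());
        while ($row = mysql_fetch_assoc($result)) {
            printf("in: %s %s\n", $row[$pk], $row[$field]);  // same idea as line 132 of the script
        }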
  6. Hi again. You raise two issues:
     1) I also thought about including the ALTER TABLE commands produced by your script in my script, but I was a bit afraid of how MySQL deals with them. Please go to the manual page and search for "For a column that has a data type". The problem mentioned there is that the field types COULD even change (extend) after the ALTER TABLE command (e.g. text -> mediumtext, varchar -> mediumtext, etc.) to fit the "larger" utf8 data. bfarber, or any other IPB expert, please tell us whether such field length changes MAY cause any problem for IPB. I was also unsure whether the same problem exists when we use e.g. "CREATE TABLE" with the utf8 charset (I guess the field size is still 64k bytes, i.e. 64k/3 utf8 chars). So I decided not to include such commands in my script at all, until I know exactly how this should be done :) There is also the possibility that different MySQL versions behave differently on the above issue! Anyway, IMHO your script will probably work.
     2) Maybe I do not perfectly understand your question, but you are probably asking which of the two databases will be modified after including the conf_global.php script, which contains a reference to your REAL (LIVE) database. The answer is very clear: ONLY the database you state in $dbname = "bak_forum"; will be affected/converted. In fact you can even comment out (or delete) the 5 "require_once" lines just after "require_once( './sources/ipsclass.php' );" and the script will still work! That way you can be sure that nothing will be modified in your LIVE database.
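     For reference, this is the kind of command I decided NOT to include in my script, together with a way to see how MySQL may have silently widened the column types (e.g. TEXT becoming MEDIUMTEXT). It is only a sketch with an example table and credentials, and should only ever be run against a backup copy.

        <?php
        // Convert one table in place, then inspect its column types and collations.
        $link = mysql_connect('localhost', 'dbuser', 'dbpass') or die(mysql_error());
        mysql_select_db('bak_forum', $link);

        $table = 'ibf_posts';                                // example table
        mysql_query("ALTER TABLE `$table` CONVERT TO CHARACTER SET utf8 COLLATE utf8_general_ci", $link)
            or die(mysql_error());

        // Field types may now have been extended to fit the "larger" utf8 data.
        $cols = mysql_query("SHOW FULL COLUMNS FROM `$table`", $link);
        while ($col = mysql_fetch_assoc($cols)) {
            printf("%s: %s (%s)\n", $col['Field'], $col['Type'], $col['Collation']);
        }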
  7. Many thanks to all of you for your nice words! Yes, I can probably upload the script to Resources, after a few improvements. Well, you seem to want to try this on a backup copy of your database on your server - I think this is possible, although I did it on my copy at home. I do not know the current charset of your database, but you should run the script on a database copy with exactly the same content, where both the copy and its tables have been created with the utf8 encoding. I did this by:
     1) Exporting the database (with the CREATE TABLE commands) to an SQL script
     2) Replacing latin1 with utf8 everywhere in the script (CREATE TABLE commands and client charset)
     3) Creating a new utf8 database
     4) Importing the SQL script into this new utf8 database
     Good luck, and please tell us if it worked! I hope it works better in your case - as I said, in my case some posts/fields were not converted, and I am still checking what went wrong; if I find something I will post it here.
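     Steps 2 and 3 can be done in a few lines of PHP, roughly as sketched below. The dump file name and credentials are only examples, and the exact replacement strings depend on how your dump was generated, so check the rewritten file by eye before importing it.

        <?php
        // Step 2: rewrite the charset declarations in the dump (a more targeted variant
        // of the "replace latin1 with utf8 everywhere" step described above).
        $dump = file_get_contents('backup.sql');
        $dump = str_replace(
            array('DEFAULT CHARSET=latin1', 'SET character_set_client = latin1', 'SET NAMES latin1'),
            array('DEFAULT CHARSET=utf8',   'SET character_set_client = utf8',   'SET NAMES utf8'),
            $dump
        );
        file_put_contents('backup_utf8.sql', $dump);

        // Step 3: create the new utf8 database that backup_utf8.sql will be imported into.
        $link = mysql_connect('localhost', 'dbuser', 'dbpass') or die(mysql_error());
        mysql_query('CREATE DATABASE `bak_forum` DEFAULT CHARACTER SET utf8 COLLATE utf8_general_ci', $link)
            or die(mysql_error());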
  8. As promised, here is my PHP script to automatically convert an IPB2 database to utf8. It does not use IPB2-style queries, but it seems to work for me, converting most of my iso-8859-7 database to utf8. However, it still fails in a few cases, which I have to investigate in the next days. The reason is probably (as I noticed) that there is some binary content within some text fields (??), which I can only guess is either already utf8-encoded characters, or I don't know what. I will have to write the specific cases where txt_convert_charsets fails to a file, to find out what really happened. As I also explain inside the file, in my case the default charset for the remote and local database was utf8, but the database had the latin1 charset, latin1_swedish_ci collation, and the IPB charset was set to 'iso-8859-7'. Hence it was also displayed incorrectly in phpMyAdmin. When I got my local SQL backup, the content was really utf8-encoded latin1 chars (I couldn't get anything better from backup/phpMyAdmin)! I then changed all the "CREATE TABLE" and "SET character_set_client" commands from latin1 to utf8, trying to make ABSOLUTELY NO modification to the content of the database, before running the attached script.
     - Serialization is handled quite nicely I think, with a command I found to fix the string lengths (without unserializing); see the sketch below.
     - It is fully automatic - it will examine your database and convert only "char varchar text enum set tinytext mediumtext longtext" fields.
     - It skips fields that are part of a primary key. Any ideas on how to convert these? (A small problem - I guess they do not contain utf8 strings.)
     - It skips conversion if the table does not have a primary key (odd, but such tables did exist in my IPB2 database - not any important ones, of course).
     To run it, just place it in your IPB home directory and change the database connection details and your source charset. You can optionally convert HTML entities after the utf8 conversion: it worked for me, but it was the opposite of what bfarber says in the above post: html_entity_decode( $out, ENT_NOQUOTES, "utf-8" ) worked fine (PHP 5.2.6, MySQL 5.0, Windows XP) while mb_convert_encoding($out, 'utf-8', 'HTML-ENTITIES') failed - I really don't know why! Feel free to try it, and to correct/improve it if possible, so that in the end we all manage to convert our databases to utf8! ipb2_utf8_convert.php VERY IMPORTANT: ONLY USE THIS FOR EXPERIMENTATION AND ONLY ON A LOCAL COPY!
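     To give an idea of the serialization handling: serialized values store byte lengths (s:<bytes>:"<string>";), and those counts become wrong after the charset conversion, so unserialize() would fail. The trick is to recompute the lengths without unserializing. The sketch below shows the idea, not the exact code from the attached script; it uses PHP 5.3+ closure syntax for clarity and assumes the simple double-quoted strings that serialize() itself produces.

        <?php
        // Recompute the s:N:"..." byte counts of a serialized value after conversion.
        function fix_serialized_lengths($serialized) {
            return preg_replace_callback(
                '/s:\d+:"(.*?)";/s',
                function ($m) { return 's:' . strlen($m[1]) . ':"' . $m[1] . '";'; },
                $serialized
            );
        }

        // Example: the Greek string "αβ" was 2 bytes in iso-8859-7 but is 4 bytes in
        // utf8, so the converted value still claims s:2 until the lengths are fixed.
        $converted = 'a:1:{s:5:"title";s:2:"αβ";}';
        var_dump(unserialize(fix_serialized_lengths($converted)));   // now unserializes cleanly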
  9. Hi, I have been following this thread (and the corresponding IPB3 thread) and I am also very much interested in this issue. I am attempting to migrate an existing IPB 2.3.5 database from iso-8859-7 to utf8. I will try to write such an automatic conversion PHP script and post it here for everybody to try, based on the IPB3 script you provided and the i18n charset library. However, before I do this, I need to ask you some important questions:
     1) Since I haven't studied any IPB3 code, will the IPB2 function $post = $this->ipsclass->txt_convert_charsets($r['post'] , 'old_char_set', 'utf-8' ); do exactly the same thing as the IPB3 function you used, $post = IPSText::convertCharsets( $r['post'], 'old_char_set', 'utf-8' );, or not?
     2) I will automatically get a list of IPB fields to convert. I plan to convert the field types: char varchar text enum set tinytext mediumtext longtext. Should I leave some of them out? Or maybe there are more to convert? Can you please specify the proper set?
     3) In the database I see a lot of Greek characters converted to HTML entities for some reason (as I said, iso-8859-7 was used as the charset in IPB), which I assume cause problems in searches. Is this handled by the txt_convert_charsets() function? Should I try to convert these entities to utf8, e.g. using a standard PHP function like html_entity_decode() or mb_convert_encoding(), or not?
     4) A terrible headache could be the text fields written with the serialize() function in IPB, where I guess conversion will fail, since string lengths may change after converting to utf8. Is this true? If so, what is the best way to deal with this, or at least is there a way to automatically identify serialized fields and leave them out of the conversion (maybe only latin characters really exist in such fields)? A rough sketch of what I have in mind is below this post.
     Many thanks in advance.
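     A rough sketch of what I have in mind for questions 3 and 4 (these are only illustrative helpers, not code from any IPB release):

        <?php
        // Question 4: detect whether a value is a serialize()'d PHP string, so such
        // fields can be skipped or given special treatment during the conversion.
        function looks_serialized($value) {
            if (!is_string($value)) {
                return false;
            }
            $value = trim($value);
            if ($value === 'b:0;') {                       // serialized boolean false
                return true;
            }
            return @unserialize($value) !== false;         // anything else that unserializes
        }

        // Question 3: decode HTML entities into real utf8 characters after the charset
        // conversion (one of the two standard functions I mention above).
        function decode_entities_utf8($text) {
            return html_entity_decode($text, ENT_NOQUOTES, 'UTF-8');
        }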