Updated: Comment #45

Problem/Motivation

HEAD is often broken on MySQL/MyISAM, SQLite and PostgreSQL as the testbot does not test under these environments.

Proposed resolution

Test on all major DBs instead of just MySQL.

Remaining tasks

A patch for PostgreSQL already exists, but does not seem to apply.

  • Find servers to do the testing on (#1 - #12) Complete
  • Consider only testing on commit and not testing patches on "secondary" databases (#11 - #12 and #37) Alternate environment tests can be run on-demand via qa.d.o
  • Consider generalizing the PostgreSQL code for SQLite, or just create separate SQLite code. (#24 - #27) Dedicated core sqlite and postgres bots are now set up
  • Eliminate the dependency on all environments completing before results returning (#39) Hacky, but we're getting around this by tricking qa.d.o into thinking these tests are for a different project client.
  • #2152443: Allow settings.php to specify the table engine is blocking creation of a dedicated MyISAM testbot
  • Concerns in #32
    • 1. Multiple PIFR environments were tried out when Coder reviews were deployed, but this contributed to instability of the testing infrastructure. The system currently does not scale up well with multiple environments, as all environments need to complete testing before a final test result is passed back to drupal.org. We do have advisory tests working for 'Coder' tests, which should apply across different environments instead of just plugins. (See #830838: Allow configurable "Advisory" reviews) Hacky, but we're getting around this by tricking qa.d.o into thinking these tests are for a different project client.
    • 2. The non-mysql databases don't pass cleanly yet. See https://drupal.org/comment/8252251#comment-8252251

#830838: Allow configurable "Advisory" reviews
#1371054: Create testbots for pgsql, sqlite

Would have been caught by this

#1998366: [meta] SQLite is broken
#1987262: [Installation Error] Symfony: "The service definition 'request' does not exist."

Original report by hass in the d.o Infrastructure Queue

Is there any chance that patches are tested against all supported databases and not only mysql with innodb?

Comments

Dave Reid’s picture

Status: Active » Postponed

There's nothing stopping us from running multi-db tests except for the fact that testing requires not just your everyday, spare computer. We need people that have capable servers to be able to donate/manage test clients. Until that happens on a larger scale, there's really not much we can do. Marking this as postponed.

See:
http://qa.drupal.org/PIFR-2-Client-Configuration-FAQs
http://qa.drupal.org/node/62

hass’s picture

For me it's very important to have the tests running on PgSQL as I do not have this environment, and I also believe it would be good if at least core were tested against PgSQL, or bugs may slip into the final D7 release. I'm not using PG myself - but I think there are users that do, and it would also help me if contrib were checked against PG... if there is a beta for contrib, I believe there must be enough CPU power available to handle at least the double load for core checks on PG. MS IIS testing is also important...

Dave Reid’s picture

Yep, we all know it's important. It's just a matter of people providing the computing power to make it possible. You'd actually also be surprised at the requirements (linked above) for running a test bot, because tests run concurrently and are expected to take about 15 minutes. For PostgreSQL I'll bet it will need more, since it's slower on tests.

pwolanin’s picture

The cost of adding 2 servers for postgres would be roughly $8-10/day (2x AWS m1.large instances at spot pricing ~$0.12-0.18/hr or c1.medium at $0.08-$0.10 per hour in US-EAST-1)

Plus the cost of someone figuring out how to make a properly configured AMI for postgres testing. So - can we get a ~$2000-$3000 donation?

Or - someone would need to donate and maintain some similar amount of computing power.

boombatower’s picture

Subscribe.

hass’s picture

Don't use this suxxx expensive AWS stuff. You can rent a dedicated server at 1&1, strato, etc. for 10-40€ per month. AWS is one of the worst and most expensive services I've ever used. Completely broken service! AWS EC2 just suxxx, and you can never trust that a machine comes back up after you have shut it down or rebooted it.

I got all the money refunded that we wasted in December when I used AWS's buggy pre-alpha services for the very first time. I do not understand why anyone uses this sh**.

Gerhard Killesreiter’s picture

I share your sentiments about cloud stuff, however the servers you get for the 10-40€ are not the servers you can use to run tests. I've also not seen servers for less than 30€.

hass’s picture

What's wrong with 40€/month servers compared to much slower machines at Amazon that cost that much per day? I rented Windows machines with SQL Server, and such a server would have cost 800€/month with only 2 cores... My dual quad PC has more horsepower... If you only need a test machine for 2 hours per month, AWS is an option; otherwise buy a PC and you know for sure it is running...

Cannot say how stupid an amazon AWS customer must be...

Josh Waihi’s picture

Sorry, I should have commented with what I know earlier (I only just got Peter's messages on IRC now).

Without altering the code base to be able to cache dumps of a standard install at the beginning of a simpletest run, I fear PostgreSQL is simply too slow to put in the test environment compared to MySQL. The biggest overhead seems to be in the setUp() call, where Drupal is installed. I think it would be faster to (A) use PostgreSQL schemas and (B) restore from a database dump rather than execute code to install Drupal. The schema would provide each test with its own namespace, like db_prefixing would.
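
For illustration, a minimal sketch of what (A) and (B) might look like with PDO - the schema naming, dump path and credentials below are made up, and it assumes the cached dump is plain SQL with unqualified table names:

<?php
// Sketch only: the dump path and schema naming are invented; the plain-SQL
// dump lands in whatever schema is first on the search_path.
$pdo = new PDO('pgsql:host=localhost;dbname=drupal_checkout', 'user', 'pass');

// (A) One schema per test run, giving each test its own namespace the way
// db_prefixing does.
$schema = 'simpletest' . mt_rand(100000, 999999);
$pdo->exec("CREATE SCHEMA $schema");
$pdo->exec("SET search_path TO $schema");

// (B) Replay the cached dump of a standard install instead of running the
// installer inside every setUp().
$pdo->exec(file_get_contents('/tmp/standard-install.sql'));

// Teardown is then a single statement:
// $pdo->exec("DROP SCHEMA $schema CASCADE");
?>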

Talking to Jimmy, this isn't going to happen for Drupal 7.

pwolanin’s picture

If we can't run tests for every patch, perhaps we can have a testbot that at least does regular checks against clean HEAD for Postgres (and sqlite3)?

In that case speed would be less of an issue.

boombatower’s picture

Currently running on commit would be the best trigger.

Josh Waihi’s picture

Hmm, several commits a day - even that would be slow. Once a day would be OK by me.

hass’s picture

I'm wondering why pgsql and sqlite support is developed at all if these database types are sooo awfully slow that we need to think this much about processing time... strange...

apaderno’s picture

There is nothing wrong with PostgreSQL and SQLite in a normal execution environment.

What qa.drupal.org is supposed to do is test all the patches filed for Drupal core code and (even if it is still in beta testing) patches filed for third-party modules. This means that there are many patches that could be tested, and for each test the site needs to create the environment, which also means creating the database tables used by Drupal or by the module being tested.

Josh Waihi’s picture

@hass, maybe you haven't been outside the world of MySQL, but not all databases are the same. That's the whole reason why there are different ones. However, Drupal development is done by an 80% MySQL user community, few of whom keep in mind that Drupal runs on different databases. As a result, things like this occur, where we build something to work fine for MySQL and only care about the other databases later. A shame indeed, but it is the way it is.

In this particular scenario, it would seem that operations that normally seldom happen to a database - CREATE, DROP, ALTER - are slowing PostgreSQL down. From what I can see, PostgreSQL is slower on these operations, especially when comparing MyISAM to PostgreSQL, because PostgreSQL is relational. No other way about it.

SQLite, on the other hand, is a lightweight database. I'm not sure how we'd expect it to perform, but I would imagine it will struggle compared to PostgreSQL and MySQL regardless.

Damien Tournoud’s picture

@hass please stop talking about stuff you have no clue about (er. please stop talking).

I'm in agreement with pwolanin (#10), as I said a few times already. There is very little added value in testing (and retesting) every patch on every architecture. Once every commit (capped at once a day) would be largely enough. We just need to figure out how to report those test results (because if we do that on commit, there is no issue to report back to).

hass’s picture

@Damien: I only asked why pgsql should be that slooooooow. Many people think it's the "better", aka a "real", SQL server in a way that mysql supposedly is not, but I do not want to discuss this. I'm not using pg myself - but this is what I get told very often when people compare mysql and pgsql. There could be database abstraction layer bugs that only show up when modules implement specific things, and those may easily be found by tests. I know that more than 80% of people here develop for mysql and do not care about pgsql, but db-specific data type bugs have been missed very often in the past. This is why I think it's time to run the tests. No idea why I should not ask in order to learn - we still have db_query(), which allows executing bad statements whether we have a db abstraction layer or not. You may have not understood this intention.

pwolanin’s picture

@Damien - seems like this is somehow a feature the PIFR master could handle and show on the status page? E.g. a daily check against each platform?

boombatower’s picture

Everything that is needed is supported. View the HEAD test that is run on commit and it will have all environments it was tested on.

pwolanin’s picture

@boombatower - so is there a way to have some environments run the HEAD test and no patch tests?

boombatower’s picture

Yes.

rfay’s picture

Status: Postponed » Active
FileSize
9.11 KB

@Josh Waihi is hard at work on this. Here's his latest work to make pgsql work in the testbot.

Josh Waihi’s picture

FileSize
3.76 KB

Here is a better version using PDO instead (less code).

Damien Tournoud’s picture

PDO is definitely the way to go, and the code is very similar to what we do in SQLite. Care to merge the two?

Josh Waihi’s picture

Merge the two database drivers? Is that such a good idea?

Damien Tournoud’s picture

I meant: implement an abstract PDO-based database engine support class (that's probably 90% of the code), and implement the specific code in child classes.
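
Something along these lines, presumably - a sketch only, with invented class names rather than anything that exists in PIFR today:

<?php
// Illustrative shape of the refactoring: the shared PDO plumbing sits in an
// abstract base class and each engine only overrides what actually differs.
abstract class PifrPdoEngine {
  protected $pdo;

  public function __construct($dsn, $user = NULL, $pass = NULL) {
    $this->pdo = new PDO($dsn, $user, $pass);
  }

  // The ~90% shared part: run SQL, import dumps, tear down tables, etc.
  public function execute($sql) {
    return $this->pdo->exec($sql);
  }

  // The engine-specific part.
  abstract public function createDatabase($name);
  abstract public function dropDatabase($name);
}

class PifrPgsqlEngine extends PifrPdoEngine {
  public function createDatabase($name) {
    $this->pdo->exec('CREATE DATABASE ' . $name);
  }
  public function dropDatabase($name) {
    $this->pdo->exec('DROP DATABASE IF EXISTS ' . $name);
  }
}

class PifrSqliteEngine extends PifrPdoEngine {
  public function createDatabase($name) {
    // SQLite "creates" a database simply by touching the file.
    touch($name . '.sqlite');
  }
  public function dropDatabase($name) {
    @unlink($name . '.sqlite');
  }
}
?>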

Josh Waihi’s picture

Maybe there are too many differences. For one thing, PostgreSQL needs two connections, because a connection has to be made to a database rather than just to the server. One connection is needed to CREATE and DROP the drupal_checkout database, and the other connection is needed to connect to the drupal_checkout database. SQLite has the advantage of just unlinking a file, or creating a file on connection.
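
For clarity, the two-connection pattern would look roughly like this (host, credentials and database names are placeholders):

<?php
// PostgreSQL will not let you drop the database you are connected to, so an
// "admin" handle pointed at the default postgres database manages the
// CREATE/DROP lifecycle, and a second handle runs the tests.
$admin = new PDO('pgsql:host=localhost;dbname=postgres', 'pifr', 'secret');
$admin->exec('DROP DATABASE IF EXISTS drupal_checkout');
$admin->exec('CREATE DATABASE drupal_checkout');

$checkout = new PDO('pgsql:host=localhost;dbname=drupal_checkout', 'pifr', 'secret');
// ... run the test suite over $checkout, then drop the database via $admin ...
?>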

I think it's not needed for now. Can we focus on getting a PostgreSQL test slave up, then think about refactoring the code?

boombatower’s picture

Not sure what is up with the patch, but it does not apply cleanly even after removing the gitignore crap and running with -p5. Please generate it from pifr as the root and make sure it is against the latest CVS.

rfay’s picture

@boombatower, it hasn't been possible to work against latest HEAD because of the problems there. This is against 6.x-2.2 (I believe). I'll leave this for Josh though.

boombatower’s picture

I'll try and work my way through the pifr issue queue... trying to get back in the game after a month of crazy life... got SimpleTest 7.x-2.x back up to speed though... so making progress. :)

hass’s picture

Any chance to get a test server for all databases supported by D7?

rfay’s picture

There are really two classes of things blocking us from running on all databases:

1. PIFR Environments have never really been deployed, and we need to be able to handle them. (Is it a complete fail if one environment fails? Or is that just advisory?) See #830838: Allow configurable "Advisory" reviews.

2. If I understand correctly, the non-mysql databases don't pass cleanly yet. We're looking the other way right now.

I think if we can get these problems resolved we can probably come up with the resources required, but these have been major blockers.

hass’s picture

Priority: Normal » Major

Any news on this issue?

rfay’s picture

Project: Drupal.org infrastructure » Drupal.org Testbots
Component: qa.drupal.org » Miscellaneous

Moving to testbots queue, where it's a dup...

apaderno’s picture

Title: Run tests on all supported database servers / engines » Run tests on all supported database servers/engines
hass’s picture

Thx, closed the newer one

jthorson’s picture

Priority: Major » Normal

If the intent is to test only D8 HEAD against Postgres, and only on commit, then we currently have this ability if we were to build a new testbot. However, note that this means we would be adding a third environment to the testing infrastructure (in addition to the existing 'Simpletest MySQL' and 'Code Review' environments), and test results are not returned until all environments have completed testing; so if postgres tests run slower, this will result in a delay in response time for MySQL branch tests as well.

My personal opinion is that we should investigate an architectural change which would make PIFT more aware of available environments on a test-by-test basis ... by making PIFT environment-aware, we could request and report results individually for each test/environment combination, thus decoupling the current inter-environment dependencies (i.e. the need for all tests to complete before feeding a result back to d.o). With this in place, it wouldn't matter how long postgres takes to complete a test suite ... it could be requested (or returned) separately from a parallel mysql test request.
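
To make the decoupling concrete, a purely illustrative sketch - none of these field names exist in PIFT; the point is simply one request record per test/environment pair, each with its own lifecycle:

<?php
// Illustrative only: d.o could render whichever results exist, rather than
// waiting for the slowest environment to finish.
$test_requests = array(
  array('test' => 'core patch #123456', 'environment' => 'simpletest-mysql', 'status' => 'pass'),
  array('test' => 'core patch #123456', 'environment' => 'simpletest-pgsql', 'status' => 'queued'),
  array('test' => 'core patch #123456', 'environment' => 'coder-review', 'status' => 'pass'),
);
?>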

This environment awareness in PIFT would also open the door for new d.o. UI elements intended for triggering specific test capabilities without triggering *all* environments to run. (Why wait for a 45 minute simpletest run, when all you want is the 3 minute Code Review results?)

EDIT: Ooops ... forgot to make my main point ... This would add a third environment to the testbot infrastructure, and all of the testbot instability over the last three months (reconfirmation fails, mixed-up MySQL/Coder tabs, and 'failed: @ reason' results) was related to underlying environment race conditions present in the code, which were never discovered until we turned on the second environment with advisory coder reviews. Because of the sheer amount of time that went into tracking these down and stabilizing the testbots, I'd be somewhat reluctant to introduce a third environment at this time.

hass’s picture

Priority: Normal » Major

When I opened this case, my main intention was to have automated tests on all systems Drupal supports. I have never run a single test on Postgres or SQLite myself, as I do not have access to those environments, and I'd like to know if my modules work on them. It's not OK that we don't run tests on these platforms when it's known that things may be broken. Today we intentionally wait until someone's site is broken!?! Ahhm, what??? This is not acceptable.

jthorson’s picture

Priority: Major » Normal

And when I bumped the priority down to normal, it was with the explanation above that I have zero intention of introducing another environment to the mix at this point in time; given the stability issues we had after rolling out the coder environment and the number of hidden environment-related bugs that it uncovered.

You may consider it major that we cover off testing on these other supported environments, but I consider it major that we don't screw with something that took a hell of a lot of work to get stable again after the last time we screwed with it. And frankly, unless I start seeing someone else posting patches to the PIFR queue, I'm not afraid to claim 'trump' on this one.

I'm more than happy to revisit this post-DrupalCon ... but preferably not before we've also eliminated the dependency for all environments to complete testing before a single result is returned, as I outlined above.

hass’s picture

I don't understand how you can argue that a broken/buggy test robot is an argument for not doing tests on all environments we support. I have no idea what was unstable or what happened in the past 3 months, but this is not a good argument for staying away from testing. By this logic we could also argue for disabling mysql testing.

rfay’s picture

While supporting multiple environments is a valuable goal, and we'd like to get there someday, we must keep what we have working. We also don't currently have the infrastructure (extra hosts) required to multiply the number of environments.

As per #39 though, if other people want to invest their time in the manner @jthorson has, well, we'll be happy to teach them. But without more people contributing and reviewing, it's not going to happen. Stability and basic functionality will remain the key goals.

jthorson’s picture

"I don't understand how you can argue that a broken/buggy test robot is an argument for not doing tests on all environments we support."

If that was my argument, then I'd have marked this 'closed(won't fix)'. Putting words in other people's mouths isn't going to help your cause ... but putting code into the issue queue will.

The PIFR environment setup needs some tweaks ... perhaps going so far as to separate out 'environments' from 'plugins' so that we don't need to list each and every matrix'ed combination of the two as a separate environment. Once these tweaks are in, we'll be in a much better situation to support multiple underlying environments (and additional testing plugins) via the testing infrastructure.

boombatower’s picture

Not to bring this all up again, but my original plans with the new system... which is now in limbo... would have placed the environment control in PIFT for this reason... I think I need to try again with the whole d.o improvements thing and see if we can get some money behind implementing the new system... open sourced of course... haven't had time to get it all published and whatnot. This whole mess is frustrating, but there's not much I seem to be able to do about it.

Hmm...longing for the days of high school when I could just spend all my extra time on this.

@hass: if someone volunteered to put in the significant commitment to complete the work necessary to make this happen cleanly, test it, and deploy it... it would be worth considering, but until then it is simply not feasible.

Liam Morland’s picture

Issue tags: +PostgreSQL

Tagging.

hass’s picture

Issue tags: +sqlite
Gaelan’s picture

Issue summary: View changes

Use the issue summary template.

Gaelan’s picture

Updated issue summary.

Gaelan’s picture

Issue summary: View changes

Add remaining tasks.

Gaelan’s picture

Issue summary: View changes

Add related issues.

hass’s picture

Any news on this issue?

hass’s picture

Issue summary: View changes

Mention original queue

Gaelan’s picture

Should any of the remaining tasks become sub-issues?

jthorson’s picture

Re: #48:

<jthorson> I could do a 'head only' postgres test setup, but every core commit would take twice the bot resources to test, and delay our HEAD test results.
<jthorson> Gaelen: The concerns in #32 is taken care of (we have 'advisory reviews' now) ... but not deeply tested.  #39 is the show-stopper, but I don't think there's a specific issue yet.  Testing only on commit is already feasible ... and finding servers to do the testing is currently under review/evaluation for all testbots, not just postgres.
jthorson’s picture

Issue summary: View changes

Add "would have been caught" section. —Gaelan

YesCT’s picture

Issue summary: View changes

added details of 32, marked one done.

catch’s picture

"I could do a 'head only' postgres test setup, but every core commit would take twice the bot resources to test, and delay our HEAD test results."

That would be completely fine with me - even if it takes 5 hours to test each commit, at least we have a 5-hour window for breakage instead of the current window of potentially weeks or months during which particular database engines or PHP versions can be broken.

edit: even if it takes 24 hours.

webchick’s picture

Hear, hear! :)

Btw, catch and I brainstormed about this a bit tonight. Here's the list of environments which would cover all the current criticals in the D8 queue:

- PHP 5.3 + MySQL (whatever we're running now; it catches lots of stuff)
- PHP 5.4
- PHP 5.5
- MySQL + MyISAM engine
- PostgreSQL (whatever the latest version is)
- SQLite (whatever the latest version is)

This would probably extend the length of core commit testing to 6-9 hours rather than 1. But the trade-off of not getting hit with critical bugs weeks or months after the fact seems very worth it.

jthorson, I loudly hear your concerns re #39 and you having too much on your shoulders. I think the main issue is the people who care the most about this are neck-deep in D8 atm. Would some additional funding help with this situation at all? Feel free to PM me if so.

jthorson’s picture

Actually, we've increased our OSUOSL capacity by 50% in order to be able to start this without stealing resources from the existing queue ... the "current" problem is that OSUOSL apparently has a bug which is preventing them from being able to create new Virtual Machines (even for them!)

We got some good help from andypost and the CIS sprint last weekend, as they worked on updating our puppet scripts for other platforms (so we don't have to build them manually).

I'll be working on this at MWDS on Thursday (and using AWS if Supercell is still flaking out) ... but for the moment, I'm in tourist mode. ;)

jthorson’s picture

Also, consider the implications if every random test failure on HEAD were to take 9 hours to resolve and restart testing of D8 patches, instead of 90 minutes ... we will need to break the 'all environments' dependency before this is truly feasible.

jthorson’s picture

Issue summary: View changes

Updated issue summary.

jthorson’s picture

Issue summary: View changes
jthorson’s picture

Issue summary: View changes
hass’s picture

Is this working for contrib, too? It was the reason why I opened this case...

jthorson’s picture

Core only right now, branch only (no patches), and no commit-triggered tests. (Wanted to provide an update, but haven't closed the issue.)

bzrudi71’s picture

Seems that there is a bug with the testbot and PostgreSQL which is not related to #2001350: [meta] Drupal cannot be installed on PostgreSQL, as the connection can't be established to the PG instance at all. From the logs in PIFR it seems the problem is that the bot tries to connect to PG via MySQL ;-)
drush si -y --db-url=mysql://drupaltestbot-my:4YafRX7K8xCp@localhost/drupaltestbotmysql

This leads to an "access denied" error and should be fixed first, before taking care of #2001350: [meta] Drupal cannot be installed on PostgreSQL. Thanks!
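
For comparison, a PostgreSQL-bound invocation would presumably use a pgsql:// URL along these lines (user, password and database name here are placeholders, not the bot's real settings):

drush si -y --db-url=pgsql://drupaltestbot-pg:PASSWORD@localhost/drupaltestbotpgsql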

jthorson’s picture

Thanks for the report ... my guess is that puppet (which is supposed to be completely disabled) has somehow re-installed the drupaltestbot-mysql package.

I'll try to take a look at this at some point over the Christmas break.

bzrudi71’s picture

@jthorson Seems you fixed the issue from #58 while reading through the bot logs - thank you!
As of today, all patches to make Drupal 8 (hopefully) install again on PostgreSQL are in :-) Now I'm really interested in Testbot results so we can start fixing issues. However, it seems that the PostgreSQL Testbot is currently not running regularly?
From the logs:
[20:38:02] Log initialized (/var/lib/drupaltestbot/sites/default/files/review.test_600303.pifr_simpletest.1391225882.log)
[20:38:02] Review timestamp: (Fri, 01/31/2014 - 20:38)
[20:38:02] Review host: (drupaltestbot1838)

Can you ask Mr. Testbot to start another try, please? :-)

jthorson’s picture

I've kicked off another run on the dedicated Postgres testbot ... that said, I haven't actually gone back and looked at the build as I said I would in #59 ... adding that to my (actually tracked) to-do list now.

EDIT: Looks like installation completed successfully, and the tests are now running. However, the assertions are not updating on the testbot status page, so we may have some more debugging to do. That said, it's definitely a step forward!

bzrudi71’s picture

Yes, it's a step forward and thanks for your support!
From the logs I found this:
exception 'PDOException' with message 'SQLSTATE[53100]: Disk full: 7 ERROR: could not extend file "base/2540263/2542211": No space left on device

Hmmh ;-)

jthorson’s picture

I see that ... and it's confusing, since the disk is only at 85% utilization.

I'll have to see if I can build up another bot with some remote debugging, and try to sort out why the assertion counts aren't going up ... the disk full may be somehow related.

mradcliffe’s picture

I was a little confused because I thought #2181291: Prevent a query from aborting the entire transaction in pgsql was necessary, but then I realized that it affects only the Standard install profile and not the Minimal (and Testing, I guess) profiles.

Awesome.

bzrudi71’s picture

@jthorson I know you were busy deploying the MySQL PHP 5.4 environments over the last few weeks. But as we get more and more PostgreSQL patches in, I wonder when we could expect a working PHP 5.4 PostgreSQL bot environment - any idea? Thanks!

xjm’s picture

Category: Feature request » Task
Priority: Normal » Critical

#2270917: Limit collection name due to MyISAM restrictions and test ConfigCollectionInfo class is the latest bug that would have been caught by this. This is probably critical to support the release of 8.0 without these regular regressions.

xjm’s picture

Can someone confirm whether #2152443: Allow settings.php to specify the table engine is a hard blocker for MyISAM support?

catch’s picture

Issue summary: View changes
catch’s picture

stefan.r’s picture

@jthorson there are some D7 PostgreSQL patches I would like to see tested, so I was wondering if there is anything I can do to help get the PostgreSQL bot up and running again? Also, is it possible to request a specific Postgres core patch to be tested, or can we only test patches that are already committed?

Perhaps there is a downloadable Vagrant image of the bot so I can help troubleshoot the "out of disk space" errors if these are still showing up?

bzrudi71’s picture

In addition to the 'out of disk' error we have the next problem, as Drupal 8 now requires PHP 5.4 and the bot is still on PHP 5.3. Having a Vagrant image to work on could be of help. mradcliffe 'played' with a PG Vagrant image, but I'm unsure if it's based on the same environment as the Drupal bots?

"I've been working on a vagrant repository with both mysql and postgresql installed for core developers. I'm not familiar enough with puppet or ruby to write my own so I forked a puphpet repo and started hacking away. I finally got a box up and running after fixing some puppetlabs postgresql issues. I still need to hack in postgresql configuration optimization based on VM memory. Currently spins up a 4GB VM."

mradcliffe’s picture

The testbots run off of drupaltestbot-puppet (I think). This creates a highly-specialized testing environment rather than one to develop off of, and there is a documentation page regarding how to run a Drupal testbot off of Vagrant. I am not familiar with the pgsql stuff @jthorson was working on.

andypost’s picture

Where can I find the postgresql testbot code?
As the infrastructure moves to new testbots, I suppose it's time to test pg.

jthorson’s picture

Documentation is still pretty light, but you can perform local postgres testing with Docker using the drupalci_testbot project.

bzrudi71’s picture

@andypost When testing, you need to apply at least the patch from #2181291: Prevent a query from aborting the entire transaction in pgsql to get usable results. The testbot is currently also affected by #2031261: Make SQLite faster by combining multiple inserts and updates in a single query. Both patches can be applied via the DCI_PATCH variable during test runs. Please note that there is also a bootable LiveTestBot image available (thanks to @ricardoarmado)

jthorson’s picture

Goal for this week: Get temporary environments up for running daily builds of each environment.

jthorson’s picture

Issue tags: +Amsterdam2014
ricardoamaro’s picture

Assigned: Unassigned » ricardoamaro

Working on creating a site to run tests on all supported database servers/engines

jthorson’s picture

Initial testing matrix:

D7 + PHP 5.4 + pgsql
D8 + PHP 5.4 + mysql
D8 + PHP 5.4 + pgsql
D8 + PHP 5.5 + mysql
D8 + PHP 5.6 + mysql

andypost’s picture

any reason sqlite skipped?

ricardoamaro’s picture

@andypost is this better?

D7 + PHP 5.4 + pgsql
D8 + PHP 5.4 + mysql
D8 + PHP 5.4 + pgsql
D8 + PHP 5.4 + sqlite
D8 + PHP 5.5 + mysql
D8 + PHP 5.6 + mysql

Because the tests take a long time, we should make an effort to keep the list on an "imperative to have" basis.

ricardoamaro’s picture

I'm almost finished building an alpha version at http://coreresults.drupal-pt.org/
Have a look and give feedback, please

stefan.r’s picture

As per #2276809: Solve PostgreSQL DDL slowliness, pgsql tests now run almost as fast as mysql

alexpott’s picture

Lovely work - the issue with the InstallerTestBase tests is #2302799: InstallerTestBase tests can not be run locally. The workaround is to have the webserver and PHP run as the same user.

YesCT’s picture

Adding the "infrastructure blocker for Drupal 8.0.0" tag, since this blocks the Drupal 8 core issue #2267715: [meta] Drupal.org (websites/infra) blockers to a Drupal 8 RC1, so that the list of infrastructure blockers to the D8 release (in issue queues outside of the core queue) stays accurate. Remove the blocker tag when this issue is fixed, and update 2267715.

ricardoamaro’s picture

Assigned: ricardoamaro » Unassigned
drumm’s picture

The issue summary mentions MyISAM, but support was dropped in #2275535: [policy, no patch] Drop support for non-transactional database engines (including MyISAM). Where is the canonical list of must-be-tested and would-be-nice-to-test DB engines? Is #81 accurate?

catch’s picture

Don't think there's a canonical list anywhere.

I think I added MyISAM support to the issue summary here when I opened #2275535: [policy, no patch] Drop support for non-transactional database engines (including MyISAM) as an either/or. Definitely not needed for 8.x testing now that it's dropped.

A recent stable release of MySQL (or in practice I think it's currently MariaDB on the current infra), SQLite and PostgreSQL should be fine for the must have - would catch most things.

Nice to have:
- 'oldest supported release' of the three. I can't think of a recent issue where this would have helped but #2348931: Use native MySQL statement preparation via PDO is in the general area where the oldest supported version would matter.

- Drizzle, MongoDB.

xjm’s picture

I think #81 covers what needs to be tested (not just the specific databases, but also the PHP versions). Though I'm not sure why it lists D7 as tested on only pgsql and not mysql; is that a mistake?

isntall’s picture

Project: Drupal.org Testbots » DrupalCI: Dispatcher (Modernizing Testbot Initiative)
andypost’s picture

Only postgres 9.4 is missing now.

webchick’s picture

Status: Active » Fixed

DrupalCI currently supports the following environments:

PHP 5.4 & MySQL 5.5
PHP 5.5 & MySQL 5.5
PHP 5.6 & MySQL 5.5
PHP 7 & MySQL 5.5
PHP 5.4 & SQLite 3.8
PHP 5.5 & SQLite 3.8
PHP 5.5 & PostgreSQL 9.1

While #2467925: [plan] Remove blockers to the "minimum viable" state of DrupalCI to ship Drupal 8 RC1 also lists PostgreSQL 9.4, I can't think of a reason why that would be anything but a nice-to-have, since any PostgreSQL testing is light years beyond our previous capabilities.

The one environment missing from the "must have" list of #51 is MyISAM. Since we removed MyISAM support from D8 in #2275535: [policy, no patch] Drop support for non-transactional database engines (including MyISAM) however, this is no longer a blocker to Drupal 8. Spun off a sub-issue at #2551257: Allow running tests on MySQL MyISAM engine about that specifically.

So given that in #88 catch says "A recent stable release of MySQL (or in practice I think it's currently MariaDB on the current infra), SQLite and PostgreSQL should be fine for the must have - would catch most things." I actually think we are done here.

For the other things in the "nice to have" list there (oldest supported version of each, Drizzle, MongoDB), we can always open dedicated follow-up issues if someone is passionate about those and wants to do the work.

I therefore think we can close this one out, certainly from the standpoint of blocking 8.0.0. Hooray!!!

webchick’s picture

Title: Run tests on all supported database servers/engines » Run tests on all supported database servers

Status: Fixed » Closed (fixed)

Automatically closed - issue fixed for 2 weeks with no activity.