Our team has a spike sprint to choose between ActiveMQ and RabbitMQ. We wrote 2 small producer/consumer spikes sending an object message containing an array of 16 strings, a timestamp, and 2 integers. The spikes run fine on our own machines (messages are produced and consumed nicely).
Then came the benchmarks. We first noticed that, on our machines, when we were sending many messages, the consumers would sometimes hang. Production kept going, but the messages just piled up in the queue.
When we went to the bench platform:
- a 2-machine RabbitMQ cluster (4 cores / 3.2 GHz, 4 GB RAM each), load balanced by a VIP
- 1 to 6 consumers running on machines like the Rabbit ones, saving the messages into a MySQL DB (same kind of machine for the DB)
- 12 producers running on 12 AS machines (Tomcat), hit by another machine running JMeter. About 600 to 700 HTTP requests per second reach the servlets, which produce the same load of RabbitMQ messages.
We found that, from time to time, some consumers hang (well, they are not blocked, they just no longer receive any messages). We can see that each consumer saves about 100 messages/second into the database, so when a consumer stops, the total message rate into the DB drops by the same proportion (e.g. if we stop 3 consumers out of 6 while producing about 600 messages/second, the DB rate drops to about 300 messages/second).
During that time, the producers are fine and still produce at the JMeter rate (about 600 msg/sec). The messages are queued, and the consumers are still "alive".
We load all the servlets with the producers first, then launch all the consumers one by one, checking that the connections are OK, and then run JMeter.
We send the messages to a direct exchange bound to a durable queue that is shared by all the consumers.
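Roughly, that setup can be sketched with the RabbitMQ Java client as follows (the host, exchange, queue, and routing key names here are placeholders for illustration, not the actual ones from our spike):

```java
import com.rabbitmq.client.Channel;
import com.rabbitmq.client.Connection;
import com.rabbitmq.client.ConnectionFactory;

public class SpikeTopology {
    public static void main(String[] args) throws Exception {
        ConnectionFactory factory = new ConnectionFactory();
        factory.setHost("rabbit-vip"); // the load-balanced VIP (placeholder name)

        try (Connection conn = factory.newConnection();
             Channel channel = conn.createChannel()) {
            // One direct exchange, one durable queue bound to it.
            // durable=true, exclusive=false, autoDelete=false
            channel.exchangeDeclare("spike.exchange", "direct", true);
            channel.queueDeclare("spike.queue", true, false, false, null);
            channel.queueBind("spike.queue", "spike.exchange", "spike.key");
            // All consumers then consume from "spike.queue",
            // so the broker round-robins deliveries between them.
        }
    }
}
```

With a single shared queue like this, RabbitMQ distributes messages across the consumers in round-robin fashion, which is why each one ends up handling roughly the same ~100 msg/sec share of the load.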
This point is key to our choice. Have you seen this with Rabbit? Do you know what is happening?
Thank you for your replies.
It is always worth setting a prefetch count when using basic.consume: call channel.basicQos(100) before the channel.basicConsume line to ensure you never have more than 100 messages queued up at the consumer.
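A minimal sketch of that advice with the RabbitMQ Java client (the host and queue names are made up, and the prefetch value of 100 is just the one suggested above):

```java
import com.rabbitmq.client.AMQP;
import com.rabbitmq.client.Channel;
import com.rabbitmq.client.Connection;
import com.rabbitmq.client.ConnectionFactory;
import com.rabbitmq.client.DefaultConsumer;
import com.rabbitmq.client.Envelope;

public class PrefetchConsumer {
    public static void main(String[] args) throws Exception {
        ConnectionFactory factory = new ConnectionFactory();
        factory.setHost("rabbit-vip"); // placeholder host

        Connection conn = factory.newConnection();
        Channel channel = conn.createChannel();

        // At most 100 unacknowledged messages in flight per consumer.
        channel.basicQos(100);

        boolean autoAck = false; // manual acks, so the prefetch limit applies
        channel.basicConsume("spike.queue", autoAck, new DefaultConsumer(channel) {
            @Override
            public void handleDelivery(String consumerTag, Envelope envelope,
                                       AMQP.BasicProperties properties, byte[] body)
                    throws java.io.IOException {
                // ... save the message to MySQL here ...
                getChannel().basicAck(envelope.getDeliveryTag(), false);
            }
        });
    }
}
```

Note that the prefetch limit only has an effect when auto-ack is off: with autoAck=true the broker considers every delivery acknowledged immediately and will push messages to the consumer as fast as it can, which can flood a slow consumer.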