Spring Integration Flow with Jdbc Message source which has dynamic query
I am trying to do change data capture (CDC) from an Oracle DB using Spring Cloud Data Flow with Kafka as the broker. I am using a polling mechanism for this: I poll the database with a basic select query at regular intervals to capture any updated data. To make the system more failure proof, I persist my last poll time in the Oracle DB and use it to fetch only the data updated after the last poll.
public MessageSource<Object> jdbcMessageSource() {
    JdbcPollingChannelAdapter jdbcPollingChannelAdapter =
            new JdbcPollingChannelAdapter(this.dataSource, this.properties.getQuery());
    jdbcPollingChannelAdapter.setUpdateSql(this.properties.getUpdate());
    return jdbcPollingChannelAdapter;
}

@Bean
public IntegrationFlow pollingFlow() {
    IntegrationFlowBuilder flowBuilder = IntegrationFlows
            .from(jdbcMessageSource(), spec -> spec.poller(Pollers.fixedDelay(3000)));
    flowBuilder.channel(this.source.output());
    flowBuilder.transform(trans, "transform");
    return flowBuilder.get();
}
My queries in application properties are as below:
query: select * from kafka_test where LAST_UPDATE_TIME >(select LAST_POLL_TIME from poll_time)
update : UPDATE poll_time SET LAST_POLL_TIME = CURRENT_TIMESTAMP
This is working perfectly for me; I am able to get the CDC data from the DB with this approach.

The problem I am looking at now is this: creating a table just to maintain the poll time is an unnecessary overhead. I would rather maintain the last poll time in a Kafka topic and retrieve it from that topic when making the next poll.
I have modified the jdbcMessageSource method as below to try that:
public MessageSource<Object> jdbcMessageSource() {
    String query = "select * from kafka_test where LAST_UPDATE_TIME > '"
            + <Last poll time value read from kafka comes here> + "'";
    JdbcPollingChannelAdapter jdbcPollingChannelAdapter =
            new JdbcPollingChannelAdapter(this.dataSource, query);
    return jdbcPollingChannelAdapter;
}
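To be clear about what that placeholder stands for: before each poll I would read the most recent record from a Kafka topic. A rough sketch of such a helper, using the plain kafka-clients KafkaConsumer, could look like the following (only an illustration; the topic name last-poll-time, the bootstrap address and this class are placeholders, not code I actually have):

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;
import org.apache.kafka.common.serialization.StringDeserializer;

// Hypothetical helper: read the latest value from a single-partition topic.
public class LastPollTimeReader {

    public String readLastPollTime() {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "last-poll-time-reader");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            TopicPartition partition = new TopicPartition("last-poll-time", 0);
            consumer.assign(Collections.singletonList(partition));
            long end = consumer.endOffsets(Collections.singletonList(partition)).get(partition);
            if (end == 0) {
                return null; // topic is still empty: no previous poll recorded
            }
            consumer.seek(partition, end - 1); // position on the most recent record only
            for (ConsumerRecord<String, String> record : consumer.poll(Duration.ofSeconds(5))) {
                return record.value();
            }
            return null;
        }
    }
}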
But Spring Cloud Data Flow instantiates the pollingFlow() bean (please see the code above) only once, so whatever query is built first stays the same. I want the query to be updated with the new poll time on each poll.

Is there a way to write a custom IntegrationFlow so that this query is updated every time a poll is made? I have tried IntegrationFlowContext for that but wasn't successful.
Thanks in advance !!!
spring-boot apache-kafka spring-integration spring-cloud-dataflow oracle-cdc
asked Mar 7 at 12:52 by Akhil Ghatiki
2 Answers
We have this test configuration (sorry, it is XML):
<inbound-channel-adapter query="select * from item where status=:status" channel="target"
                         data-source="dataSource" select-sql-parameter-source="parameterSource"
                         update="delete from item"/>

<beans:bean id="parameterSource" factory-bean="parameterSourceFactory"
            factory-method="createParameterSourceNoCache">
    <beans:constructor-arg value=""/>
</beans:bean>

<beans:bean id="parameterSourceFactory"
            class="org.springframework.integration.jdbc.ExpressionEvaluatingSqlParameterSourceFactory">
    <beans:property name="parameterExpressions">
        <beans:map>
            <beans:entry key="status" value="@statusBean.which()"/>
        </beans:map>
    </beans:property>
    <beans:property name="sqlParameterTypes">
        <beans:map>
            <beans:entry key="status" value="#{T(java.sql.Types).INTEGER}"/>
        </beans:map>
    </beans:property>
</beans:bean>

<beans:bean id="statusBean"
            class="org.springframework.integration.jdbc.config.JdbcPollingChannelAdapterParserTests$Status"/>
Pay attention to the ExpressionEvaluatingSqlParameterSourceFactory and its createParameterSourceNoCache() factory method. Its result can be used for the select-sql-parameter-source. The JdbcPollingChannelAdapter has a setSelectSqlParameterSource for the same purpose.
So, you configure an ExpressionEvaluatingSqlParameterSourceFactory so that a query parameter is resolved as an expression, for example a bean method invocation that fetches the desired value from Kafka. Then createParameterSourceNoCache() will give you the expected SqlParameterSource.

There is some info in the docs as well: https://docs.spring.io/spring-integration/docs/current/reference/html/#jdbc-inbound-channel-adapter
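Translated to the Java DSL style used in the question, a minimal sketch could look like the following. This is only a sketch: the lastPollTimeProvider bean and its latest() method are placeholders for whatever component reads the last poll time back from Kafka, the named parameter :lastPollTime is assumed in the query, and the surrounding configuration class (with its dataSource) is the one from the question.

@Bean
public ExpressionEvaluatingSqlParameterSourceFactory parameterSourceFactory() {
    ExpressionEvaluatingSqlParameterSourceFactory factory =
            new ExpressionEvaluatingSqlParameterSourceFactory();
    // Resolve the :lastPollTime query parameter via a bean method call;
    // @lastPollTimeProvider.latest() is a placeholder for a Kafka-backed lookup.
    factory.setParameterExpressions(
            java.util.Collections.singletonMap("lastPollTime", "@lastPollTimeProvider.latest()"));
    return factory;
}

@Bean
public MessageSource<Object> jdbcMessageSource(
        ExpressionEvaluatingSqlParameterSourceFactory parameterSourceFactory) {
    JdbcPollingChannelAdapter adapter = new JdbcPollingChannelAdapter(this.dataSource,
            "select * from kafka_test where LAST_UPDATE_TIME > :lastPollTime");
    // createParameterSourceNoCache() means the expression is re-evaluated on every poll,
    // so each poll picks up the latest value without re-creating any beans.
    adapter.setSelectSqlParameterSource(parameterSourceFactory.createParameterSourceNoCache(""));
    return adapter;
}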
answered Mar 7 at 14:27 by Artem Bilan
See Artem's answer for the mechanism for a dynamic query in the standard adapter; an alternative, however, would be to simply wrap a JdbcTemplate in a bean and invoke it with

IntegrationFlows.from(myPojo(), "runQuery", e -> ...)
        ...

or even a simple lambda:

.from(() -> jdbcTemplate...)
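As a rough sketch of that alternative (the QueryRunner class and the Supplier-based lookup of the last poll time are placeholder names, not from the answer itself):

// imports assumed: org.springframework.jdbc.core.JdbcTemplate, java.util.List, java.util.Map, java.util.function.Supplier
public class QueryRunner {

    private final JdbcTemplate jdbcTemplate;

    private final Supplier<String> lastPollTime; // placeholder for a Kafka-backed lookup

    public QueryRunner(JdbcTemplate jdbcTemplate, Supplier<String> lastPollTime) {
        this.jdbcTemplate = jdbcTemplate;
        this.lastPollTime = lastPollTime;
    }

    public List<Map<String, Object>> runQuery() {
        // The last poll time is fetched again on every invocation, so each poll runs a fresh query.
        return this.jdbcTemplate.queryForList(
                "select * from kafka_test where LAST_UPDATE_TIME > ?", this.lastPollTime.get());
    }
}

@Bean
public IntegrationFlow pollingFlow(QueryRunner queryRunner) {
    // Same downstream flow as in the question; only the message source changes.
    return IntegrationFlows
            .from(queryRunner, "runQuery", e -> e.poller(Pollers.fixedDelay(3000)))
            .channel(this.source.output())
            .get();
}

The key point is that runQuery() rebuilds its parameters on every poll, so no bean has to be re-created to change the query.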
answered Mar 7 at 14:25, edited Mar 7 at 14:34 by Gary Russell

Comment: "See my answer, please." – Artem Bilan, Mar 7 at 14:27
Your Answer
StackExchange.ifUsing("editor", function ()
StackExchange.using("externalEditor", function ()
StackExchange.using("snippets", function ()
StackExchange.snippets.init();
);
);
, "code-snippets");
StackExchange.ready(function()
var channelOptions =
tags: "".split(" "),
id: "1"
;
initTagRenderer("".split(" "), "".split(" "), channelOptions);
StackExchange.using("externalEditor", function()
// Have to fire editor after snippets, if snippets enabled
if (StackExchange.settings.snippets.snippetsEnabled)
StackExchange.using("snippets", function()
createEditor();
);
else
createEditor();
);
function createEditor()
StackExchange.prepareEditor(
heartbeatType: 'answer',
autoActivateHeartbeat: false,
convertImagesToLinks: true,
noModals: true,
showLowRepImageUploadWarning: true,
reputationToPostImages: 10,
bindNavPrevention: true,
postfix: "",
imageUploader:
brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
allowUrls: true
,
onDemand: true,
discardSelector: ".discard-answer"
,immediatelyShowMarkdownHelp:true
);
);
Sign up or log in
StackExchange.ready(function ()
StackExchange.helpers.onClickDraftSave('#login-link');
);
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Post as a guest
Required, but never shown
StackExchange.ready(
function ()
StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fstackoverflow.com%2fquestions%2f55044253%2fspring-integration-flow-with-jdbc-message-source-which-has-dynamic-query%23new-answer', 'question_page');
);
Post as a guest
Required, but never shown
2 Answers
2
active
oldest
votes
2 Answers
2
active
oldest
votes
active
oldest
votes
active
oldest
votes
We have this test configuration (sorry, it is an XML):
<inbound-channel-adapter query="select * from item where status=:status" channel="target"
data-source="dataSource" select-sql-parameter-source="parameterSource"
update="delete from item"/>
<beans:bean id="parameterSource" factory-bean="parameterSourceFactory"
factory-method="createParameterSourceNoCache">
<beans:constructor-arg value=""/>
</beans:bean>
<beans:bean id="parameterSourceFactory"
class="org.springframework.integration.jdbc.ExpressionEvaluatingSqlParameterSourceFactory">
<beans:property name="parameterExpressions">
<beans:map>
<beans:entry key="status" value="@statusBean.which()"/>
</beans:map>
</beans:property>
<beans:property name="sqlParameterTypes">
<beans:map>
<beans:entry key="status" value="# T(java.sql.Types).INTEGER"/>
</beans:map>
</beans:property>
</beans:bean>
<beans:bean id="statusBean"
class="org.springframework.integration.jdbc.config.JdbcPollingChannelAdapterParserTests$Status"/>
Pay attention to the ExpressionEvaluatingSqlParameterSourceFactory
and its createParameterSourceNoCache()
factory. The this result can be used for the select-sql-parameter-source
.
The JdbcPollingChannelAdapter
has a setSelectSqlParameterSource
on the matter.
So, you configure a ExpressionEvaluatingSqlParameterSourceFactory
to be able to resolve some query parameter as an expression for some bean method invocation to get a desired value from Kafka. Then createParameterSourceNoCache()
will help you to obtain an expected SqlParameterSource
.
There is some info in docs as well: https://docs.spring.io/spring-integration/docs/current/reference/html/#jdbc-inbound-channel-adapter
add a comment |
We have this test configuration (sorry, it is an XML):
<inbound-channel-adapter query="select * from item where status=:status" channel="target"
data-source="dataSource" select-sql-parameter-source="parameterSource"
update="delete from item"/>
<beans:bean id="parameterSource" factory-bean="parameterSourceFactory"
factory-method="createParameterSourceNoCache">
<beans:constructor-arg value=""/>
</beans:bean>
<beans:bean id="parameterSourceFactory"
class="org.springframework.integration.jdbc.ExpressionEvaluatingSqlParameterSourceFactory">
<beans:property name="parameterExpressions">
<beans:map>
<beans:entry key="status" value="@statusBean.which()"/>
</beans:map>
</beans:property>
<beans:property name="sqlParameterTypes">
<beans:map>
<beans:entry key="status" value="# T(java.sql.Types).INTEGER"/>
</beans:map>
</beans:property>
</beans:bean>
<beans:bean id="statusBean"
class="org.springframework.integration.jdbc.config.JdbcPollingChannelAdapterParserTests$Status"/>
Pay attention to the ExpressionEvaluatingSqlParameterSourceFactory
and its createParameterSourceNoCache()
factory. The this result can be used for the select-sql-parameter-source
.
The JdbcPollingChannelAdapter
has a setSelectSqlParameterSource
on the matter.
So, you configure a ExpressionEvaluatingSqlParameterSourceFactory
to be able to resolve some query parameter as an expression for some bean method invocation to get a desired value from Kafka. Then createParameterSourceNoCache()
will help you to obtain an expected SqlParameterSource
.
There is some info in docs as well: https://docs.spring.io/spring-integration/docs/current/reference/html/#jdbc-inbound-channel-adapter
add a comment |
We have this test configuration (sorry, it is an XML):
<inbound-channel-adapter query="select * from item where status=:status" channel="target"
data-source="dataSource" select-sql-parameter-source="parameterSource"
update="delete from item"/>
<beans:bean id="parameterSource" factory-bean="parameterSourceFactory"
factory-method="createParameterSourceNoCache">
<beans:constructor-arg value=""/>
</beans:bean>
<beans:bean id="parameterSourceFactory"
class="org.springframework.integration.jdbc.ExpressionEvaluatingSqlParameterSourceFactory">
<beans:property name="parameterExpressions">
<beans:map>
<beans:entry key="status" value="@statusBean.which()"/>
</beans:map>
</beans:property>
<beans:property name="sqlParameterTypes">
<beans:map>
<beans:entry key="status" value="# T(java.sql.Types).INTEGER"/>
</beans:map>
</beans:property>
</beans:bean>
<beans:bean id="statusBean"
class="org.springframework.integration.jdbc.config.JdbcPollingChannelAdapterParserTests$Status"/>
Pay attention to the ExpressionEvaluatingSqlParameterSourceFactory
and its createParameterSourceNoCache()
factory. The this result can be used for the select-sql-parameter-source
.
The JdbcPollingChannelAdapter
has a setSelectSqlParameterSource
on the matter.
So, you configure a ExpressionEvaluatingSqlParameterSourceFactory
to be able to resolve some query parameter as an expression for some bean method invocation to get a desired value from Kafka. Then createParameterSourceNoCache()
will help you to obtain an expected SqlParameterSource
.
There is some info in docs as well: https://docs.spring.io/spring-integration/docs/current/reference/html/#jdbc-inbound-channel-adapter
We have this test configuration (sorry, it is an XML):
<inbound-channel-adapter query="select * from item where status=:status" channel="target"
data-source="dataSource" select-sql-parameter-source="parameterSource"
update="delete from item"/>
<beans:bean id="parameterSource" factory-bean="parameterSourceFactory"
factory-method="createParameterSourceNoCache">
<beans:constructor-arg value=""/>
</beans:bean>
<beans:bean id="parameterSourceFactory"
class="org.springframework.integration.jdbc.ExpressionEvaluatingSqlParameterSourceFactory">
<beans:property name="parameterExpressions">
<beans:map>
<beans:entry key="status" value="@statusBean.which()"/>
</beans:map>
</beans:property>
<beans:property name="sqlParameterTypes">
<beans:map>
<beans:entry key="status" value="# T(java.sql.Types).INTEGER"/>
</beans:map>
</beans:property>
</beans:bean>
<beans:bean id="statusBean"
class="org.springframework.integration.jdbc.config.JdbcPollingChannelAdapterParserTests$Status"/>
Pay attention to the ExpressionEvaluatingSqlParameterSourceFactory
and its createParameterSourceNoCache()
factory. The this result can be used for the select-sql-parameter-source
.
The JdbcPollingChannelAdapter
has a setSelectSqlParameterSource
on the matter.
So, you configure a ExpressionEvaluatingSqlParameterSourceFactory
to be able to resolve some query parameter as an expression for some bean method invocation to get a desired value from Kafka. Then createParameterSourceNoCache()
will help you to obtain an expected SqlParameterSource
.
There is some info in docs as well: https://docs.spring.io/spring-integration/docs/current/reference/html/#jdbc-inbound-channel-adapter
answered Mar 7 at 14:27
Artem BilanArtem Bilan
67.6k84973
67.6k84973
add a comment |
add a comment |
See Artem's answer for the mechanism for a dynamic query in the standard adapter; an alternative, however, would be to simply wrap a JdbcTemplate
in a Bean and invoke it with
IntegrationFlows.from(myPojo(), "runQuery", e -> ...)
...
or even a simple lambda
.from(() -> jdbcTemplate...)
See my answer, please,
– Artem Bilan
Mar 7 at 14:27
add a comment |
See Artem's answer for the mechanism for a dynamic query in the standard adapter; an alternative, however, would be to simply wrap a JdbcTemplate
in a Bean and invoke it with
IntegrationFlows.from(myPojo(), "runQuery", e -> ...)
...
or even a simple lambda
.from(() -> jdbcTemplate...)
See my answer, please,
– Artem Bilan
Mar 7 at 14:27
add a comment |
See Artem's answer for the mechanism for a dynamic query in the standard adapter; an alternative, however, would be to simply wrap a JdbcTemplate
in a Bean and invoke it with
IntegrationFlows.from(myPojo(), "runQuery", e -> ...)
...
or even a simple lambda
.from(() -> jdbcTemplate...)
See Artem's answer for the mechanism for a dynamic query in the standard adapter; an alternative, however, would be to simply wrap a JdbcTemplate
in a Bean and invoke it with
IntegrationFlows.from(myPojo(), "runQuery", e -> ...)
...
or even a simple lambda
.from(() -> jdbcTemplate...)
edited Mar 7 at 14:34
answered Mar 7 at 14:25
Gary RussellGary Russell
84.4k85077
84.4k85077
See my answer, please,
– Artem Bilan
Mar 7 at 14:27
add a comment |
See my answer, please,
– Artem Bilan
Mar 7 at 14:27
See my answer, please,
– Artem Bilan
Mar 7 at 14:27
See my answer, please,
– Artem Bilan
Mar 7 at 14:27
add a comment |
Thanks for contributing an answer to Stack Overflow!
- Please be sure to answer the question. Provide details and share your research!
But avoid …
- Asking for help, clarification, or responding to other answers.
- Making statements based on opinion; back them up with references or personal experience.
To learn more, see our tips on writing great answers.
Sign up or log in
StackExchange.ready(function ()
StackExchange.helpers.onClickDraftSave('#login-link');
);
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Post as a guest
Required, but never shown
StackExchange.ready(
function ()
StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fstackoverflow.com%2fquestions%2f55044253%2fspring-integration-flow-with-jdbc-message-source-which-has-dynamic-query%23new-answer', 'question_page');
);
Post as a guest
Required, but never shown
Sign up or log in
StackExchange.ready(function ()
StackExchange.helpers.onClickDraftSave('#login-link');
);
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Post as a guest
Required, but never shown
Sign up or log in
StackExchange.ready(function ()
StackExchange.helpers.onClickDraftSave('#login-link');
);
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Post as a guest
Required, but never shown
Sign up or log in
StackExchange.ready(function ()
StackExchange.helpers.onClickDraftSave('#login-link');
);
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Post as a guest
Required, but never shown
Required, but never shown
Required, but never shown
Required, but never shown
Required, but never shown
Required, but never shown
Required, but never shown
Required, but never shown
Required, but never shown