I have a single Redis server running on a Windows 2008 machine, and multiple websites that each use a BasicRedisClientManager.
The manager is set up once on app start and stored in the Application[]
store so it's reusable across all users. The problem I'm having is that one site runs on a dedicated server while the other sites run on localhost, where the Redis server is also running. One is production and the others are test locations for new work/demonstration. The remote client works just fine, but the localhost clients will not communicate with the server and seem to be in a locked/dead state. When I run the site I just get a stack overflow and the site recycles itself. Below is some sample code of the setup I have. I see there is some information on creating locks among clients, but I'm not sure if that's the route I need to take:
http://www.servicestack.net/docs/redis-client/distributed-locking-with-redis
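For reference, the locking pattern from that page looks roughly like this (a sketch based on the docs, not something I have in place; the key name and timeout are illustrative values, not from my code):

```csharp
// Sketch of ServiceStack's distributed-lock API from the linked docs.
// "my-lock-key" and the 10-second timeout are made-up values.
using (var redisClient = basicRedisClientManager.GetClient())
{
    // AcquireLock blocks until the lock is obtained or the timeout expires,
    // and releases the lock when the returned IDisposable is disposed.
    using (redisClient.AcquireLock("my-lock-key", TimeSpan.FromSeconds(10)))
    {
        // critical section: only one client across all sites runs this at a time
    }
}
```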
Sample Code:
protected override void Application_Start(object sender, EventArgs e)
{
    /*
     * Create BasicRedisClientManager for Redis.
     */
    // AppSettings[] already returns a string, so no ToString() is needed.
    var redisHost = ConfigurationManager.AppSettings["RedisHostIp"];
    var basicRedisClientManager = new BasicRedisClientManager(redisHost);
    Application["BasicRedisClientManager"] = basicRedisClientManager;
    // ...
}
Then it's used in a class constructor:
public CacheManager()
{
    if (Application["BasicRedisClientManager"] != null)
    {
        /* local field */
        basicRedisClientManager = (BasicRedisClientManager)Application["BasicRedisClientManager"];
    }
}
Within the class that uses the manager:
if (basicRedisClientManager != null)
{
    using (var redisClient = (RedisClient)basicRedisClientManager.GetClient())
    {
        if (!saveToInProcOnly && redisClient != null)
        {
            using (var redis = redisClient.GetTypedClient<T>())
            {
                redis.SetEntry(key, value);
            }
        }
    }
}
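For completeness, the read side follows the same pattern (sketched from memory; GetValue is the typed client's read counterpart to SetEntry, and `key` is the same cache key as above):

```csharp
T cachedValue = default(T);
if (basicRedisClientManager != null)
{
    using (var redisClient = (RedisClient)basicRedisClientManager.GetClient())
    {
        using (var redis = redisClient.GetTypedClient<T>())
        {
            // Returns default(T) when the key is not present in Redis.
            cachedValue = redis.GetValue(key);
        }
    }
}
```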
This logic is the same across all sites.
Any insight is much appreciated!