Max retries exceeded with historical API

chimirala
Hi Team,

I am trying to pull 5-minute historical data for backtesting for 50 securities. The loop runs for one security and then I keep getting a Max retries error. I have a sleep call which pauses for 2 seconds between requests.

I am not sure whether it's worth taking this API. Could you kindly check this?

Note: I am not using this for the live market. This is specifically for backtesting.


  • rakeshr
    @chimirala
    ConnectTimeout
    Seems like a timeout during the initial connection. Can you increase the timeout during initialization and try?
    kite = KiteConnect(api_key="your_api_key", timeout=10)
  • chimirala
    Hi Rakesh,

    It's still the same. No clue what the issue is. The live API is working fine and this historical API seems to be a big pain.


  • rakeshr
    @chimirala
    Is this timeout happening for any specific instrument_token?
    Could you paste your historical data fetching code (along with the instrument list)? We will take a look at it.
  • chimirala
    Sorry for delay.

    I am looping through a list of instruments (around 50), and after one instrument I keep getting this error.

    import datetime as dt
    import pandas as pd
    from time import sleep
    # `kite` is an authenticated KiteConnect instance

    def get_data(instrument):
        final_df = pd.DataFrame()
        col_names = ['Date_Time', 'Open', 'High', 'Low', 'Close', 'Volume']

        # print('Getting Data for ' + str(instrument))
        # The limit is 100 days per request, so loop 30 times for ~3000 days
        for i in range(30):
            start_date = dt.datetime.strptime('10-06-2020', "%d-%m-%Y") - dt.timedelta((30 - i) * 100)
            # +1 at the end to get 99 days; the 100th day comes in the next iteration
            end_date = dt.datetime.strptime('10-06-2020', "%d-%m-%Y") - dt.timedelta((30 - i - 1) * 100 + 1)
            data = pd.DataFrame(kite.historical_data(instrument, start_date, end_date, "5minute"))
            sleep(10)
            final_df = final_df.append(data)
            final_df = final_df.reset_index(drop=True)
            sleep(5)
        final_df.columns = col_names
        final_df['Date'] = final_df['Date_Time'].apply(lambda x: x.date())
        final_df['Time'] = final_df['Date_Time'].apply(lambda x: x.time())

        return final_df

    instrument = 121345

    e.g. list: [3014145, 121345, 664321, 113921]
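The 100-day windowing in the loop above can be factored into a small helper that only computes the date ranges; a minimal sketch (the function and its names are illustrative, not part of the Kite API):

```python
import datetime as dt

def date_windows(end, days=3000, window=100):
    """Split a `days`-long lookback ending at `end` into (start, stop) pairs
    of at most `window` days, oldest first, one pair per API request."""
    pairs = []
    n = days // window
    for i in range(n):
        start = end - dt.timedelta(days=(n - i) * window)
        # stop one day short so the next window picks up the boundary day
        stop = end - dt.timedelta(days=(n - i - 1) * window + 1)
        pairs.append((start, stop))
    return pairs
```

Each pair can then be passed to kite.historical_data, with a pause between calls to stay under the rate limit.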

    Regards
    BNS
  • chimirala
    Actually, this is strange. If I fetch instrument by instrument individually, it works. But if I put it in a loop, it fails.
  • BQ5126
    BQ5126 edited July 2020
    I am facing a similar issue. Running it for a single entry works fine, but inside a loop it gives the following error.
    kiteconnect.exceptions.DataException: Unknown Content-Type (text/html) with response: (b"504 Gateway Time-out\nThe server didn't respond in time.\n\n")

    I have already tried increasing the timeout.
    Also, this cannot be due to an error in the code, because sometimes on re-running it the error disappears.
  • rakeshr
    @BQ5126
    kiteconnect.exceptions.DataException: Unknown Content-Type (text/html) with response: (b"504 Gateway Time-out\nThe server didn't respond in time.\n\n")
    We will look into this. Meanwhile, you can handle this exception and retry.
  • praneeth1984
    I am also getting this issue now. It never happened in the past. I am using a 15-minute candle size.
  • rakeshr
    @praneeth1984
    You can handle the above DataException and try again after some delay. E.g.:
    from kiteconnect import exceptions
    try:
        # Fetch historical data
        data = kite.historical_data(instrument, start_date, end_date, "5minute")
    except exceptions.DataException:
        # Retry the historical data fetch after a delay
        sleep(2)
        data = kite.historical_data(instrument, start_date, end_date, "5minute")
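This retry advice generalizes to a small wrapper with backoff; a sketch under the assumption that the fetch is wrapped in a zero-argument callable (the wrapper and its names are illustrative, not part of kiteconnect):

```python
import time

def fetch_with_retry(fetch, exc_types=(Exception,), retries=3, base_delay=2.0):
    """Call the zero-argument `fetch` until it succeeds, sleeping with
    exponential backoff between attempts; re-raise after the last failure."""
    for attempt in range(retries):
        try:
            return fetch()
        except exc_types:
            if attempt == retries - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))

# With kiteconnect you would catch its exception types, e.g.:
# data = fetch_with_retry(
#     lambda: kite.historical_data(token, start, end, "5minute"),
#     exc_types=(exceptions.DataException, exceptions.NetworkException),
# )
```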
  • chimirala
    Hi,

    Seriously, this is irritating. It works for one security, but when I loop through multiple instruments it keeps throwing a connection timeout error. I am fed up with this now. It may be better to go to another historical data provider and take the data directly.
  • rakeshr
    @chimirala
    It works for one-security but when I loop through multiple instruments, it kept throwing connection timeout error
    Historical data fetch is working fine on our end. There can be multiple reasons for a connection timeout error: the requesting server, the local internet connection, etc. If the timeout field has already been increased and checked, you can try once with a different connection and have a look.
  • chimirala
    Hi Rakesh, if it were working fine, why would I waste my time and yours here? I will now try with a Google Cloud server and update here.
  • vanshtuli
    Hi @chimirala,

    So I was facing the same issue for quite a while. After digging in, I realized that we get this error when we try to fetch data for BSE instruments. Once I switched to NSE, it worked well.
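If mixed-exchange token lists are indeed the trigger, one way to test that theory is to filter the list down to NSE tokens before looping. A minimal sketch, assuming `kite` is an authenticated KiteConnect client and that `kite.instruments("NSE")` returns a list of dicts with an `instrument_token` key; the helper name is illustrative:

```python
def nse_only(kite, tokens):
    """Keep only the tokens that appear in the NSE instrument dump."""
    nse_tokens = {row["instrument_token"] for row in kite.instruments("NSE")}
    return [t for t in tokens if t in nse_tokens]
```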