I subscribed to 300 stocks but got ticks for only 80-90 of them. I start recording before market open and keep it running till 9:16. Is there some mistake in my code or in my understanding of the API?
It can happen that not all 300 stocks trade (and therefore tick) at the same time or every second. So, at any point in time, not all 300 stocks will tick, and most of the time you will see a smaller number of stocks ticking.
I recommend you try fetching data using getQuote for the instruments for which you did not receive ticks. With the rate limit in place, you can write a loop that keeps requesting until all your scrips have received their first tick for the day. 9:15 will be flooded with traffic; a GET request should solve your issue, and you can continue with your strategy.
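A rough sketch of that backfill idea, assuming trd_portfolio maps NSE tradingsymbols to instrument tokens, kite is an authenticated KiteConnect client, and ticked_tokens is a set populated from on_ticks (these names are illustrative, not from the original code):

ticked_tokens = set()   # add tick['instrument_token'] to this inside on_ticks

def backfill_missing_quotes(kite, trd_portfolio):
    # instruments that have not produced a single tick yet
    missing = [sym for sym, tok in trd_portfolio.items() if tok not in ticked_tokens]
    quotes = {}
    # quote() takes "EXCHANGE:TRADINGSYMBOL" keys; request in chunks to respect
    # the per-call instrument cap and the API rate limit
    for i in range(0, len(missing), 500):
        batch = ["NSE:" + sym for sym in missing[i:i + 500]]
        quotes.update(kite.quote(batch))
    return quotes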
I wanted to find the reason for the missing packets, as later I also need to build candles; if I am dropping packets there would be a huge mismatch in the data. As I am new to Python, I think there might be a silly mistake in my code.
In two different Jupyter notebooks I ran the same code with the same instrument tokens at the same time. One received ticks for 80 stocks, the other for 86 stocks, with only 28 stocks in common. So I am dropping packets somehow.
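Checking how much the two runs actually overlapped is just a set comparison on the recorded instrument tokens; for example, assuming each notebook ended up with a DataFrame like tick_df (called tick_df_a and tick_df_b here purely for illustration):

tokens_a = set(tick_df_a['Instrument_Token'])        # instruments seen by notebook 1
tokens_b = set(tick_df_b['Instrument_Token'])        # instruments seen by notebook 2
print(len(tokens_a), len(tokens_b), len(tokens_a & tokens_b))   # e.g. 80, 86, 28
print(sorted(tokens_a ^ tokens_b))                   # instruments seen by only one of the runs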
If you are new to Python, do not attempt to candle the data until you know how to deal with ticks. I recommend you get comfortable with the API, the data structures and the OMS first. Once you are through with that you won't need candles; the candle is the biggest disgrace to API trading.
Since you are specific about complete data, the only other solution might be the historical data API. If you do wish to procure the ticks yourself, you need to check these (a sketch of the buffering idea follows below):
the time taken for a scrip to update in the df;
storing the data in a temporary buffer and then pushing it to the df, which will ensure those few packets are not missed;
your NIC speed (try changing to 1 Gbps full duplex);
and your processor.
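A minimal sketch of the "temporary buffer, then push to the df" point above; the names tick_buffer and flush_to_df are illustrative and not from the original code:

import threading
import pandas as pd

tick_buffer = []
buffer_lock = threading.Lock()
frames = []                      # flushed batches; pd.concat(frames) when a full table is needed

def buffer_tick(tick):
    # called from the websocket callback: append only, no heavy work here
    with buffer_lock:
        tick_buffer.append({'Instrument_Token': tick['instrument_token'],
                            'LTP': tick['last_price']})

def flush_to_df():
    # run periodically (e.g. once a second) from a separate thread
    global tick_buffer
    with buffer_lock:
        batch, tick_buffer = tick_buffer, []
    if batch:
        frames.append(pd.DataFrame(batch))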
@ZI4453 Can you help me with the process for: "If you do wish to procure the ticks yourself, you need to check these: the time taken for a scrip to update in the df; storing the data in a temporary buffer and then pushing it to the df, which will ensure those few packets are not missed; your NIC speed (try changing to 1 Gbps full duplex)"?
My bad, it should read "if you do not wish to procure data from the API" (and by NIC speed I mean your LAN speed). Since storing in a temporary buffer requires a bit of extra coding, you will have to dig a bit deeper into the code. Regarding timing, you need to record the time before moving the data to the data frame; once it is moved, check the time difference. There are multiple ways to reduce the computation; I have gone through a few options and finally settled on SQLite, but my requirements are totally different. As mentioned earlier, start with the basics; getting into candling is an unnecessary waste of time. 99% of traders believe that the candle is god for trading, but it is not. You can do a lot more in trading than with traditional methods if you know coding, numbers, the API and how the market works.
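The timing check described above amounts to storing the time just before the data is moved into the DataFrame and checking the difference once it is done; a small self-contained illustration, with dummy rows shaped like d1 in the code further below:

import time
import pandas as pd

cols = ['Instrument_Token', 'Time', 'LTP', 'Open', 'Close', 'BP_1', 'AP_1']
# dummy rows standing in for real ticks, just to time the conversion
d1 = {str(i): [i, None, 100.0, 99.5, 100.5, 99.9, 100.1] for i in range(5000)}

start = time.perf_counter()                 # store the time before moving data
df = pd.DataFrame.from_dict(d1, orient='index', columns=cols)
elapsed = time.perf_counter() - start       # check the difference once it is moved
print('pushing %d rows into a DataFrame took %.4f s' % (len(d1), elapsed))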
@ZI4453 Thank you so much for your help. Can you suggest a course or some reading through which I can dig deeper into coding and learn more? (I have already gone through all of his videos.) I have been struggling with this for a long time. Earlier I had only backtested using historical data, but working with an API is totally different. Thank you.
Review our API documentation; if you are reasonably comfortable with REST APIs you should be able to pick it up, and it has almost everything you need. Try API trading with one or two stocks. Try a dict instead of a df (a sketch follows below). Try to keep your strategy focused on preventing losses. By the time you are "live" you will have figured out most of the things you need. Any theories or strategies available in open sources might be rubbish; if those theories really worked and generated profit, every trader would be earning six figures. Learn the market, the order execution flow, data structures and, most of all, the REST API!
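As a sketch of the "dict instead of df" suggestion: keep only the latest tick per instrument in a plain dict (cheap constant-time updates) and build a DataFrame only when a tabular view is actually needed. The names below are illustrative:

import pandas as pd

latest = {}   # instrument_token -> latest tick fields

def on_tick(tick):
    latest[tick['instrument_token']] = {'LTP': tick['last_price'],
                                        'Open': tick['ohlc']['open'],
                                        'Close': tick['ohlc']['close']}

def snapshot():
    # one-off conversion when a table is actually required
    return pd.DataFrame.from_dict(latest, orient='index')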
For reference, this is the full code I am running:
from kiteconnect import KiteConnect
from kiteconnect import KiteTicker
import datetime
import pandas as pd
import threading
import import_ipynb   # allows importing other notebooks (Nifty300 presumably comes from one)
from queue import Queue
kws =""
kite = ""
trd_portfolio = Nifty300
subscribe = list(trd_portfolio.values()) #instrument tokens
q = Queue()
def on_ticks(ws, ticks):
    # hand the tick list off to the queue so the callback returns quickly
    q.put(ticks)

def on_close(ws, code, reason):
    print('Connection Error')

def on_connect(ws, response):
    ws.subscribe(subscribe)
    ws.set_mode(ws.MODE_FULL, subscribe)
kws.on_ticks = on_ticks
kws.on_connect = on_connect
kws.on_close = on_close
df_cols = ['Instrument_Token','Time','LTP','Open','Close','BP_1','AP_1']  # column names
data_frame = pd.DataFrame(data=[],columns=df_cols, index=[])
tick_df=pd.DataFrame()
ltp_df=pd.DataFrame()
d1={}
def threader():
    while True:
        worker = q.get()
        get_queue(worker)
        q.task_done()

# start 10 daemon worker threads to drain the tick queue
for x in range(10):
    t = threading.Thread(target=threader)
    # classifying as a daemon, so they will die when the main dies
    t.daemon = True
    # begins, must come after daemon definition
    t.start()
def get_queue(worker):
    # NOTE: d1, tick_df and ltp_df are shared by all 10 worker threads without a lock
    global data_frame, df_cols, trd_portfolio, d1, tick_df, ltp_df, record
    try:
        for company_data in worker:
            token = company_data['instrument_token']
            time = company_data['timestamp']
            LTP = company_data['last_price']
            #LTQ=company_data['last_quantity']
            #LTT=company_data['last_trade_time']
            #vol=company_data['volume']
            Open = company_data['ohlc']['open']
            #high=company_data['ohlc']['high']
            #low=company_data['ohlc']['low']
            close = company_data['ohlc']['close']
            #bo1=company_data['depth']['buy'][0]['orders']
            bp1 = company_data['depth']['buy'][0]['price']
            #bq1=company_data['depth']['buy'][0]['quantity']
            #bo2=company_data['depth']['buy'][1]['orders']
            #bp2=company_data['depth']['buy'][1]['price']
            #bq2=company_data['depth']['buy'][1]['quantity']
            #bo3=company_data['depth']['buy'][2]['orders']
            #bp3=company_data['depth']['buy'][2]['price']
            #bq3=company_data['depth']['buy'][2]['quantity']
            #bo4=company_data['depth']['buy'][3]['orders']
            #bp4=company_data['depth']['buy'][3]['price']
            #bq4=company_data['depth']['buy'][3]['quantity']
            #bo5=company_data['depth']['buy'][4]['orders']
            #bp5=company_data['depth']['buy'][4]['price']
            #bq5=company_data['depth']['buy'][4]['quantity']
            #so1=company_data['depth']['sell'][0]['orders']
            sp1 = company_data['depth']['sell'][0]['price']
            #sq1=company_data['depth']['sell'][0]['quantity']
            #so2=company_data['depth']['sell'][1]['orders']
            #sp2=company_data['depth']['sell'][1]['price']
            #sq2=company_data['depth']['sell'][1]['quantity']
            #so3=company_data['depth']['sell'][2]['orders']
            #sp3=company_data['depth']['sell'][2]['price']
            #sq3=company_data['depth']['sell'][2]['quantity']
            #so4=company_data['depth']['sell'][3]['orders']
            #sp4=company_data['depth']['sell'][3]['price']
            #sq4=company_data['depth']['sell'][3]['quantity']
            #so5=company_data['depth']['sell'][4]['orders']
            #sp5=company_data['depth']['sell'][4]['price']
            #sq5=company_data['depth']['sell'][4]['quantity']
            timestamp = str(datetime.datetime.now().time())
            d1[timestamp] = [token,time,LTP,Open,close,bp1,sp1]
            #d1[timestamp] = [token,time,LTP,LTT,LTQ,vol,Open,high,low,close,bo1,bp1,bq1,bo2,bp2,bq2,bo3,bp3,bq3,bo4,bp4,bq4,bo5,bp5,bq5,so1,sp1,sq1,so2,sp2,sq2,so3,sp3,sq3,so4,sp4,sq4,so5,sp5,sq5]
            # rebuilding the full DataFrame from d1 on every tick gets slower as d1 grows
            tick_df = pd.DataFrame.from_dict(d1, orient='index', columns=df_cols)
            ltp_df = tick_df.reset_index()
    except Exception as e:
        # re-raising here ends the worker thread; consider logging instead
        raise e
I expected ticks for at least 250 of the 300 stocks.
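One thing the snippet never shows is the websocket actually being started, so presumably the ticker is created with real credentials and connected in another cell; a hedged version of that missing step:

# assumed to live in another notebook cell; credentials are redacted in the post
# kws = KiteTicker(api_key, access_token)
kws.connect(threaded=True)   # start the ticker without blocking the notebook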