Exception ignored: RuntimeError: Event loop is closed - problem with the async_await branch #14
Nothing to worry about; just check whether you got the result you wanted. Try running it again. It looks like a lost-connection error to me. If the error still persists, let me know and I will look into it.
The error is still persisting:

```
Exception ignored in: <function _ProactorBasePipeTransport.__del__ at 0x00000131DBEB3820>
Traceback (most recent call last):
  File "C:\Program Files\WindowsApps\PythonSoftwareFoundation.Python.3.9_3.9.3568.0_x64__qbz5n2kfra8p0\lib\asyncio\proactor_events.py", line 116, in __del__
    self.close()
  File "C:\Program Files\WindowsApps\PythonSoftwareFoundation.Python.3.9_3.9.3568.0_x64__qbz5n2kfra8p0\lib\asyncio\proactor_events.py", line 108, in close
    self._loop.call_soon(self._call_connection_lost, None)
  File "C:\Program Files\WindowsApps\PythonSoftwareFoundation.Python.3.9_3.9.3568.0_x64__qbz5n2kfra8p0\lib\asyncio\base_events.py", line 751, in call_soon
    self._check_closed()
  File "C:\Program Files\WindowsApps\PythonSoftwareFoundation.Python.3.9_3.9.3568.0_x64__qbz5n2kfra8p0\lib\asyncio\base_events.py", line 515, in _check_closed
    raise RuntimeError('Event loop is closed')
RuntimeError: Event loop is closed
```
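For context: this "Exception ignored ... Event loop is closed" message comes from the Windows-only `ProactorEventLoop`, which Python 3.8+ uses by default on Windows; its transports can fail during interpreter shutdown after the loop is already closed. A common workaround (my own suggestion, not necessarily the fix the maintainer applied) is to switch to the selector event loop policy before any loop is created:

```python
import asyncio
import sys

# Assumption: the warning is caused by the default ProactorEventLoop on
# Windows. Switching to the selector policy before creating any loop
# often silences it. This is a no-op on other platforms.
if sys.platform == "win32":
    asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())

async def main():
    await asyncio.sleep(0)  # placeholder for the actual scraping calls
    return "done"

result = asyncio.run(main())
print(result)
```

Note that the selector loop has some limitations on Windows (e.g. no subprocess support), so this is a mitigation rather than a root-cause fix.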
Hey @ihabpalamino,
Here is my code:

```python
import pandas as pd
from tweeterpy import TweeterPy, util

twitter = TweeterPy()
twitter.login("*****", "****")

data = []
userid = twitter.get_user_id("2MInteractive")
user_tweets = twitter.get_user_tweets(userid, from_date="2023-07-24",
                                      to_date="2023-07-27", total=5)

for result in user_tweets:
    for tweet_entry in result['data']:
        tweet_data = (tweet_entry.get('content', {})
                                 .get('itemContent', {})
                                 .get('tweet_results', {})
                                 .get('result', {})
                                 .get('legacy', {}))
        # 'created_at' is a single timestamp string (None if missing)
        created_at = tweet_data.get('created_at')

        # Append one row per tweet
        data.append({
            "id_post": util.find_nested_key(tweet_entry, "screen_name"),
            "username": util.find_nested_key(tweet_entry, "screen_name"),
            "content": util.find_nested_key(tweet_entry, "full_text"),
            "Date": created_at,
            "viewVideo": util.find_nested_key(tweet_entry, "viewCount"),
            "likecount": util.find_nested_key(tweet_entry, "favorite_count"),
            "shares": util.find_nested_key(tweet_entry, "retweet_count"),
            "comments": util.find_nested_key(tweet_entry, "reply_count"),
            "postUrl": util.find_nested_key(tweet_entry, "expanded_url"),
            "platformname": "twitter",
        })

# Build a DataFrame, print it as JSON, then save it to CSV
df = pd.DataFrame(data, columns=["id_post", "username", "content", "Date",
                                 "viewVideo", "likecount", "shares",
                                 "comments", "postUrl", "platformname"])
data_json = df.to_json(orient='records', indent=4, force_ascii=False)
print(data_json)
df.to_csv('C:/Users/HP Probook/OneDrive/Images/Bureau/CsvScraping/twitter.csv',
          index=False, encoding='utf-8')
```
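For readers unfamiliar with `util.find_nested_key`, the code above leans on it to pull fields out of Twitter's deeply nested response objects. Here is a minimal sketch of that idea (my own illustration of a recursive key search, not tweeterpy's actual implementation):

```python
def find_nested_key(obj, target):
    """Depth-first search a nested dict/list structure for the first
    value stored under `target`; returns None if the key is absent."""
    if isinstance(obj, dict):
        if target in obj:
            return obj[target]
        for value in obj.values():
            found = find_nested_key(value, target)
            if found is not None:
                return found
    elif isinstance(obj, list):
        for item in obj:
            found = find_nested_key(item, target)
            if found is not None:
                return found
    return None

# Toy structure shaped like a trimmed-down tweet object (hypothetical data)
tweet = {"result": {"legacy": {"full_text": "hello", "reply_count": 3}}}
print(find_nested_key(tweet, "full_text"))  # hello
```

This is why the script can ask for `"full_text"` or `"reply_count"` without spelling out the full path each time.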
Let me make some changes to the code; I will let you know when it's done. Edit: this seems to be a Windows-specific issue. I was testing on Ubuntu, which might be why I didn't get the error. For the time being, here is a quick fix if you want to apply it; otherwise I will update the code shortly.
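One quick fix that circulates in the community for exactly this "Exception ignored ... Event loop is closed" message (an assumption on my part; the maintainer's actual patch may differ) is to wrap `_ProactorBasePipeTransport.__del__` so the spurious `RuntimeError` raised during interpreter shutdown is swallowed:

```python
import asyncio
import functools
import sys

def silence_event_loop_closed(func):
    """Wrap func so RuntimeError('Event loop is closed') raised during
    interpreter shutdown is swallowed; any other RuntimeError re-raises."""
    @functools.wraps(func)
    def wrapper(self, *args, **kwargs):
        try:
            return func(self, *args, **kwargs)
        except RuntimeError as e:
            if str(e) != "Event loop is closed":
                raise
    return wrapper

# The proactor transport only exists on Windows, so patch conditionally.
if sys.platform == "win32":
    from asyncio.proactor_events import _ProactorBasePipeTransport
    _ProactorBasePipeTransport.__del__ = silence_event_loop_closed(
        _ProactorBasePipeTransport.__del__
    )
```

This only hides the shutdown-time warning; it does not change how the scraper itself runs.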
Thanks, waiting for you to resolve the issue.
Hi @ihabpalamino,
Okay, I will check whether it is possible to push the date since/until support to the master branch, because it already works fine.
I have tried it and it works fine, no more errors. Thank you so much!
As the issue seems to have been resolved by commit 96275ad, I am closing it.
Thank you so much. I hope it won't take long; I really need it for my final project. Thank you for all your hard work!
While trying to scrape using the async_await branch, I got the result I wanted, but I also got this error.