From SO: http://stackoverflow.com/questions/33337798/unicodeencodeerror-when-using-pandas-method-to-sql-on-a-dataframe-with-unicode-c
To reproduce:
```python
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine('sqlite:///:memory:')
df = pd.DataFrame([[1, 2], [3, 4]], columns=[u'\xe9', u'b'])
df.to_sql('data', engine, if_exists='replace', index=False)
```
which gives

```
UnicodeEncodeError: 'ascii' codec can't encode character u'\xe9' in position 0: ordinal not in range(128)
```

because of this line: https://github.com/pydata/pandas/blob/master/pandas/io/sql.py#L856, where it stringifies the individual column names with `str` (which fails on Python 2 with unicode column names).
I suppose we should just use `compat.text_type` instead of `str`.
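A minimal sketch of what the suggested fix amounts to (the variable names here are illustrative, not the actual pandas internals): on Python 2, `compat.text_type` is `unicode`, while `str` is the bytes type whose implicit ASCII encoding raises the error above; on Python 3 it is simply `str`.

```python
import sys

# Unicode-safe text type: `unicode` on Python 2, `str` on Python 3.
# This mirrors what pandas exposes as compat.text_type.
text_type = str if sys.version_info[0] >= 3 else unicode  # noqa: F821

columns = [u'\xe9', u'b']

# Stringifying with text_type instead of str avoids the implicit
# ASCII encoding of non-ASCII column names on Python 2.
safe_names = [text_type(c) for c in columns]
```

With `str(u'\xe9')` on Python 2 this raises `UnicodeEncodeError`; with `text_type(u'\xe9')` it stays unicode on both versions.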