Hi, I hope you can see this message, as I am unable to post in your loginpass repo.
Thank you for making a version of loginpass that works with Authlib 1.x. It is extremely helpful, and I would like to use your branch instead of the broken official repo.
However, there is an error when using this code with Google or Azure.
at _flask.py, line 63, in auth
    user_info = remote.parse_id_token(token)
TypeError: OpenIDMixin.parse_id_token() missing 1 required positional argument: 'nonce'
The solution is described in lepture/authlib#400. That is, replace the problematic line with:
user_info = token['userinfo']
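For context, here is a minimal sketch of why this works. In Authlib 1.x, the token returned by `authorize_access_token()` already contains the parsed ID-token claims under the `userinfo` key when the `openid` scope is requested, so the explicit `parse_id_token(token)` call (which now requires a `nonce` argument) is no longer needed. The token values below are hypothetical placeholders, not real output:

```python
# Shape of the token dict returned by remote.authorize_access_token()
# in Authlib >= 1.0 (values are hypothetical placeholders):
token = {
    "access_token": "ya29.example",  # placeholder
    "token_type": "Bearer",
    "userinfo": {"sub": "12345", "email": "user@example.com"},
}

# Old code (Authlib 0.x): user_info = remote.parse_id_token(token)
# New code (Authlib 1.x): read the already-parsed claims directly.
user_info = token["userinfo"]
print(user_info["sub"])
```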
Could you patch the code at your convenience, so we don't need to make a new branch but use yours?
Thank you, feel free to delete this thread afterward.