References

VASWANI, Ashish; SHAZEER, Noam; PARMAR, Niki; USZKOREIT, Jakob; JONES, Llion; GOMEZ, Aidan N.; KAISER, Lukasz; POLOSUKHIN, Illia. Attention is all you need. Advances in Neural Information Processing Systems, v. 30, p. 5998-6008, 2017. Available at: https://doi.org/10.48550/arXiv.1706.03762. Accessed on: 1 Dec. 2024.
GOOGLE. Search language understanding with BERT. Available at: https://blog.google/products/search/search-language-understanding-bert/. Accessed on: 1 Dec. 2024.
AWS. What is prompt engineering? Available at: https://aws.amazon.com/pt/what-is/prompt-engineering/. Accessed on: 1 Dec. 2024.
CLARK, Kevin; KHANDELWAL, Urvashi; LEVY, Omer; MANNING, Christopher D. What does BERT look at? An analysis of BERT's attention. Proceedings of the 2019 ACL Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP, 2019. Available at: https://arxiv.org/abs/1906.04341. Accessed on: 1 Dec. 2024.
GOLDBERG, Yoav. Assessing BERT's Syntactic Abilities. arXiv preprint arXiv:1901.05287, 2019. Available at: https://arxiv.org/abs/1901.05287. Accessed on: 1 Dec. 2024.
NIELSEN, Michael A. How the backpropagation algorithm works. In: Neural Networks and Deep Learning. Available at: http://neuralnetworksanddeeplearning.com/chap2.html. Accessed on: 1 Dec. 2024.
TUNSTALL, Lewis; VON WERRA, Leandro; WOLF, Thomas. Natural Language Processing with Transformers: Building Language Applications with Hugging Face. O'Reilly Media, 2022.
RAU, David; WANG, Shuai; DÉJEAN, Hervé; CLINCHANT, Stéphane. Context Embeddings for Efficient Answer Generation in RAG. arXiv preprint arXiv:2407.09252, 2024. Available at: https://arxiv.org/abs/2407.09252. Accessed on: 1 Dec. 2024.
ELASTIC. Generative AI: transformers explained. Available at: https://www.elastic.co/search-labs/blog/generative-ai-transformers-explained. Accessed on: 2 Dec. 2024.
PPE DETECTION MODEL: EzPoint