
Segmentation fault when urls file has 30000 urls #2

Open
mbsimonovic opened this issue Aug 30, 2017 · 1 comment

Comments

@mbsimonovic

$ cat 30000urls.txt | wc -l
30000
$ ./ab  -c 100 -v 4 -n 2000 -L 30000urls.txt
Segmentation fault (core dumped)
$

Works fine when I reduce the number of urls to 20000.

@vahldiek

vahldiek commented Feb 6, 2018

You can increase the number of allowed URLs by raising the `#define LIST_LINES` value in ab.c. Its default is 20,000; anything beyond that causes a segmentation fault, because the array bounds are not checked when the urls file is read.
