Can I find out the allocation request that caused my Python MemoryError?










16















Context



My small Python script uses a library to work on some relatively large data. The standard algorithm for this task is a dynamic programming algorithm, so presumably the library "under the hood" allocates a large array to keep track of the partial results of the DP. Indeed, when I try to give it fairly large input, it immediately gives a MemoryError.



Preferably without digging into the depths of the library, I want to figure out if it is worth trying this algorithm on a different machine with more memory, or trying to trim down a bit on my input size, or if it's a lost cause for the data size I am trying to use.



Question



When my Python code raises a MemoryError, is there a "top-down" way for me to investigate the size of the allocation that caused the error, e.g. by inspecting the error object?



























  • 5





    This is a good overview of MemoryError: airbrake.io/blog/python-exception-handling/memoryerror. What is the DP library you're using? What is the size of the very large input? Similar to the forced out-of-bounds example in the blog post, you could try looping, allocating memory based on N and throwing it away until it fails. That'll tell you where N breaks down. As for your direct question, "how to investigate what the size of memory was that my code tried to allocate which caused the error", I did not see anything immediately obvious. Interesting question!

    – Scott Skiles
    Nov 13 '18 at 13:49











  • @ScottSkiles, at this point my practical problem has more or less been solved with an approximate/probabilistic solution, and it's just a curiosity for me about error objects in Python. The context is just to make clear why one might care about the problem, and mostly separate from the actual question. The algorithm was for computing a variant of Levenshtein distance for approximate substring matching, and my data was (if I recall correctly) around a million characters.

    – Mees de Vries
    Nov 13 '18 at 14:38






  • 1





    From the article referenced by @ScottSkiles, it seems like you could use psutil.virtual_memory() in your error handling to get the memory usage data you are looking for. That said, I am not aware of a way to get this info from the error itself per your question.

    – benvc
    Nov 13 '18 at 14:58











  • @ScottSkiles @benvc if either of you would turn the fact about psutil into an answer I'd be happy to accept and award bounty.

    – Mees de Vries
    Nov 15 '18 at 10:41












  • @benvc go ahead. I'm traveling.

    – Scott Skiles
    Nov 16 '18 at 11:27
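The psutil suggestion from the comments above can be sketched as follows. This is a hedged sketch, not code from the question: psutil is a third-party package (`pip install psutil`), the `memory_state` helper is a made-up name, and the MemoryError is simulated so the handler actually runs.

```python
import psutil  # third-party: pip install psutil

def memory_state():
    """Snapshot of system-wide memory at the moment the error is handled."""
    mem = psutil.virtual_memory()
    return {
        "total_gib": mem.total / 2**30,
        "available_gib": mem.available / 2**30,
        "percent_used": mem.percent,
    }

try:
    raise MemoryError("simulated allocation failure")
except MemoryError as exc:
    state = memory_state()
    print(f"{exc}: {state['available_gib']:.2f} GiB of "
          f"{state['total_gib']:.2f} GiB still available")
```

Note that this reports how much memory was left when the allocation failed, not how large the failed allocation request was; it only bounds the answer from below.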















python python-3.x error-handling out-of-memory
















asked Sep 20 '18 at 11:50









Mees de Vries



















3 Answers
























2





+50









You can't see this from the MemoryError exception itself; the exception is raised for any situation where memory allocation fails, including in Python internals that are not directly connected to the code creating new Python data structures. Some modules create locks or other support objects, and those operations can fail when memory has run out.



You also can't necessarily know how much memory would be required for the whole operation to succeed. If the library creates several data structures over the course of the operation, the last straw could be allocating memory for a string used as a dictionary key, or copying the whole existing data structure for mutation, or anything in between; none of that tells you how much more memory the remainder of the process will need.



That said, Python can give you detailed information on what memory allocations are being made, and when, and where, using the tracemalloc module. Using that module and an experimental approach, you could estimate how much memory your data set would require to complete.



The trick is to find data sets for which the process can complete, at several different sizes, and to measure how much memory those runs require. Create snapshots before and after with tracemalloc.take_snapshot(), compare the differences and statistics between the snapshots for each data set, and perhaps you can extrapolate from that information how much more memory your larger data set would need. It depends, of course, on the nature of the operation and the data sets, but if there is any kind of pattern, tracemalloc is your best shot at discovering it.
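A minimal sketch of that snapshot-comparison approach, where `run_algorithm` is a hypothetical stand-in for the real library call:

```python
import tracemalloc

def run_algorithm(n):
    # Hypothetical stand-in for the library call; allocates O(n) memory.
    return [0] * n

tracemalloc.start()
measurements = {}
for n in (10_000, 20_000, 40_000):
    before = tracemalloc.take_snapshot()
    result = run_algorithm(n)
    after = tracemalloc.take_snapshot()
    # Net bytes allocated between the two snapshots, grouped by line.
    net = sum(s.size_diff for s in after.compare_to(before, "lineno"))
    measurements[n] = net
    del result
tracemalloc.stop()

for n, net in measurements.items():
    print(f"n={n}: ~{net / 1024:.0f} KiB net allocation")
```

If the measurements grow, say, quadratically in n, you can estimate whether the input size that currently fails could ever fit in a bigger machine's RAM.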






answered Nov 17 '18 at 17:25 by Martijn Pieters






























    4














    You can see the memory allocation with Pympler, but you will need to add the debugging statements locally in the library that you are using. Assuming a standard PyPI package, here are the steps:

    1. Clone the package locally.

    2. Use the summary module of Pympler. Place the following inside the main recursion method (all_objects comes from Pympler's muppy module):

     from pympler import muppy, summary

     def data_intensive_method(data_xyz):
         all_objects = muppy.get_objects()
         sum1 = summary.summarize(all_objects)
         summary.print_(sum1)
         ...

    3. Run pip install -e . to install the edited package locally.

    4. Run your main program and check the console for memory usage at each iteration.





    answered Nov 19 '18 at 11:20 by amirathi






























      1














      It appears that MemoryError is not created with any associated data:

      def crash():
          x = 32 * 10 ** 9
          return 'a' * x

      try:
          crash()
      except MemoryError as e:
          print(vars(e))  # prints: {}

      This makes sense - how could the error carry that data if no memory is left to store it?

      I don't think there's an easy way out. You can start from the traceback that the MemoryError causes and investigate with a debugger, or use a memory profiler like pympler (or psutil, as suggested in the comments).






















































        3 Answers
        3






        active

        oldest

        votes








        3 Answers
        3






        active

        oldest

        votes









        active

        oldest

        votes






        active

        oldest

        votes









        2





        +50









        You can't see from the MemoryError exception, and the exception is raised for any situation where memory allocation failed, including Python internals that do not directly connect to code creating new Python data structures; some modules create locks or other support objects and those operations can fail due to memory having run out.



        You also can't necessarily know how much memory would be required to have the whole operation succeed. If the library creates several data structures over the course of operation, trying to allocate memory for a string used as a dictionary key could be the last straw, or it could be copying the whole existing data structure for mutation, or anything in between, but this doesn't say anything about how much memory is going to be needed, in addition, for the remainder of the process.



        That said, Python can give you detailed information on what memory allocations are being made, and when, and where, using the tracemalloc module. Using that module and an experimental approach, you could estimate how much memory your data set would require to complete.



        The trick is to find data sets for which the process can be completed. You'd want to find data sets of different sizes, and you can then measure how much memory those data structures require. You'd create snapshots before and after with tracemalloc.take_snapshot(), compare differences and statistics between the snapshots for those data sets, and perhaps you can extrapolate from that information how much more memory your larger data set would need. It depends, of course, on the nature of the operation and the datasets, but if there is any kind of pattern tracemalloc is your best shot to discover it.






        share|improve this answer



























          2





          +50









          You can't see from the MemoryError exception, and the exception is raised for any situation where memory allocation failed, including Python internals that do not directly connect to code creating new Python data structures; some modules create locks or other support objects and those operations can fail due to memory having run out.



          You also can't necessarily know how much memory would be required to have the whole operation succeed. If the library creates several data structures over the course of operation, trying to allocate memory for a string used as a dictionary key could be the last straw, or it could be copying the whole existing data structure for mutation, or anything in between, but this doesn't say anything about how much memory is going to be needed, in addition, for the remainder of the process.



          That said, Python can give you detailed information on what memory allocations are being made, and when, and where, using the tracemalloc module. Using that module and an experimental approach, you could estimate how much memory your data set would require to complete.



          The trick is to find data sets for which the process can be completed. You'd want to find data sets of different sizes, and you can then measure how much memory those data structures require. You'd create snapshots before and after with tracemalloc.take_snapshot(), compare differences and statistics between the snapshots for those data sets, and perhaps you can extrapolate from that information how much more memory your larger data set would need. It depends, of course, on the nature of the operation and the datasets, but if there is any kind of pattern tracemalloc is your best shot to discover it.






          share|improve this answer

























            2





            +50







            2





            +50



            2




            +50





            You can't see from the MemoryError exception, and the exception is raised for any situation where memory allocation failed, including Python internals that do not directly connect to code creating new Python data structures; some modules create locks or other support objects and those operations can fail due to memory having run out.



            You also can't necessarily know how much memory would be required to have the whole operation succeed. If the library creates several data structures over the course of operation, trying to allocate memory for a string used as a dictionary key could be the last straw, or it could be copying the whole existing data structure for mutation, or anything in between, but this doesn't say anything about how much memory is going to be needed, in addition, for the remainder of the process.



            That said, Python can give you detailed information on what memory allocations are being made, and when, and where, using the tracemalloc module. Using that module and an experimental approach, you could estimate how much memory your data set would require to complete.



            The trick is to find data sets for which the process can be completed. You'd want to find data sets of different sizes, and you can then measure how much memory those data structures require. You'd create snapshots before and after with tracemalloc.take_snapshot(), compare differences and statistics between the snapshots for those data sets, and perhaps you can extrapolate from that information how much more memory your larger data set would need. It depends, of course, on the nature of the operation and the datasets, but if there is any kind of pattern tracemalloc is your best shot to discover it.






            share|improve this answer













            You can't see from the MemoryError exception, and the exception is raised for any situation where memory allocation failed, including Python internals that do not directly connect to code creating new Python data structures; some modules create locks or other support objects and those operations can fail due to memory having run out.



            You also can't necessarily know how much memory would be required to have the whole operation succeed. If the library creates several data structures over the course of operation, trying to allocate memory for a string used as a dictionary key could be the last straw, or it could be copying the whole existing data structure for mutation, or anything in between, but this doesn't say anything about how much memory is going to be needed, in addition, for the remainder of the process.



            That said, Python can give you detailed information on what memory allocations are being made, and when, and where, using the tracemalloc module. Using that module and an experimental approach, you could estimate how much memory your data set would require to complete.



            The trick is to find data sets for which the process can be completed. You'd want to find data sets of different sizes, and you can then measure how much memory those data structures require. You'd create snapshots before and after with tracemalloc.take_snapshot(), compare differences and statistics between the snapshots for those data sets, and perhaps you can extrapolate from that information how much more memory your larger data set would need. It depends, of course, on the nature of the operation and the datasets, but if there is any kind of pattern tracemalloc is your best shot to discover it.







            share|improve this answer












            share|improve this answer



            share|improve this answer










            answered Nov 17 '18 at 17:25









            Martijn PietersMartijn Pieters

            723k14125372343




            723k14125372343























                4














                You can see the memory allocation with Pyampler but you will need to add the debugging statements locally in the library that you are using. Assuming a standard PyPi package, here are the steps:



                1. Clone the package locally.

                2 Use summary module of Pyampler. Place following inside the main recursion method,



                 from pympler import summary
                def data_intensive_method(data_xyz)
                sum1 = summary.summarize(all_objects)
                summary.print_(sum1)
                ...


                1. Run pip install -e . to install the edited package locally.

                2. Run your main program and check the console for memory usage at each iteration.





                share|improve this answer



























                  4














                  You can see the memory allocation with Pyampler but you will need to add the debugging statements locally in the library that you are using. Assuming a standard PyPi package, here are the steps:



                  1. Clone the package locally.

                  2 Use summary module of Pyampler. Place following inside the main recursion method,



                   from pympler import summary
                  def data_intensive_method(data_xyz)
                  sum1 = summary.summarize(all_objects)
                  summary.print_(sum1)
                  ...


                  1. Run pip install -e . to install the edited package locally.

                  2. Run your main program and check the console for memory usage at each iteration.





                  share|improve this answer

























                    4












                    4








                    4







                    You can see the memory allocation with Pyampler but you will need to add the debugging statements locally in the library that you are using. Assuming a standard PyPi package, here are the steps:



                    1. Clone the package locally.

                    2 Use summary module of Pyampler. Place following inside the main recursion method,



                     from pympler import summary
                    def data_intensive_method(data_xyz)
                    sum1 = summary.summarize(all_objects)
                    summary.print_(sum1)
                    ...


                    1. Run pip install -e . to install the edited package locally.

                    2. Run your main program and check the console for memory usage at each iteration.





                    share|improve this answer













                    You can see the memory allocation with Pyampler but you will need to add the debugging statements locally in the library that you are using. Assuming a standard PyPi package, here are the steps:



                    1. Clone the package locally.

                    2 Use summary module of Pyampler. Place following inside the main recursion method,



                     from pympler import summary
                    def data_intensive_method(data_xyz)
                    sum1 = summary.summarize(all_objects)
                    summary.print_(sum1)
                    ...


                    1. Run pip install -e . to install the edited package locally.

                    2. Run your main program and check the console for memory usage at each iteration.






                    share|improve this answer












                    share|improve this answer



                    share|improve this answer










                    answered Nov 19 '18 at 11:20









                    amirathiamirathi

                    8614




                    8614





















                        1














                        It appears that MemoryError is not created with any associated data:

                        def crash():
                            x = 32 * 10 ** 9
                            return 'a' * x

                        try:
                            crash()
                        except MemoryError as e:
                            print(vars(e))  # prints: {}

                        This makes sense: how could it carry anything if no memory is left?

                        I don't think there's an easy way out. You can start from the traceback that the MemoryError causes and investigate with a debugger, or use a memory profiler like pympler (or psutil, as suggested in the comments).
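                        On Python 3.4+ the standard library's tracemalloc module is another option: started early enough, it attributes allocations to source lines, which can point at the library call that grew the table. A minimal sketch (the list comprehension stands in for the library's allocation):

```python
import tracemalloc

tracemalloc.start()

data = [bytes(10_000) for _ in range(1_000)]  # stand-in: allocate ~10 MB

snapshot = tracemalloc.take_snapshot()
for stat in snapshot.statistics('lineno')[:3]:
    print(stat)  # largest allocation sites, by source line

current, peak = tracemalloc.get_traced_memory()
print(current, peak)  # bytes currently traced, and the peak so far
```

                        The peak figure is particularly useful here: it tells you how far the process got before the failing allocation, even though the MemoryError itself carries no size.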






                            answered Nov 16 '18 at 22:14









                            roeen30

                            56629



