Human Generated Data

Title

New York City (woman with hose)

Date

1940, printed later

People

Artist: Helen Levitt, American (1913–2009)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.2664

Copyright

© Estate of Helen Levitt

Machine Generated Data

Tags (each label is listed with the model's confidence score, 0–100)

Amazon
created on 2022-01-22

Clothing 98.6
Apparel 98.6
Person 98.3
Human 98.3
Dress 95.6
Female 93.1
Handrail 88.8
Banister 88.8
Railing 84.4
Woman 78.4
Girl 67.6
Shorts 60.7
Urban 59.4
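
These labels match the output shape of Amazon Rekognition's DetectLabels API. A minimal sketch of a call that could produce them, assuming configured AWS credentials; the file name and confidence threshold are illustrative, not taken from this record:

# Sketch: Rekognition label detection (hypothetical local file).
import boto3

client = boto3.client("rekognition")
with open("levitt_nyc.jpg", "rb") as f:  # illustrative file name
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # assumed cutoff
    )
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")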

Clarifai
created on 2023-10-26

people 99.9
one 96.9
adult 96.6
woman 96.1
child 95
monochrome 93.8
vintage 93.8
step 93.7
art 93.5
girl 92.3
street 88.8
portrait 88.8
retro 85.8
wear 85.6
model 85.5
dress 85.1
two 84.8
administration 83.2
old 81.3
vehicle 78.3
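
Clarifai's general model scores each concept from 0 to 1, which scales to the 0–100 values above. A minimal sketch against Clarifai's v2 REST API; the API key, model name, and image URL are placeholders and assumptions, not taken from this record:

# Sketch: Clarifai concept tagging via the v2 REST API.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"  # placeholder
resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/levitt.jpg"}}}]},
)
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")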

Imagga
created on 2022-01-22

turnstile 54.1
gate 42.5
device 38.5
movable barrier 32.5
air conditioner 29.3
mousetrap 28.7
shopping 27.5
basket 25.2
cart 24.3
cooling system 24.3
trap 23.8
metal 23.3
barrier 21.9
container 21.7
equipment 19.8
shop 19.7
buy 18.8
shopping cart 18.7
mechanism 18.6
shopping basket 18.2
business 17.6
3d 17
market 16
sale 14.8
trolley 14.8
object 14.6
supermarket 14.6
store 14.2
commerce 14
box 12.8
technology 12.6
purchase 12.5
silver 12.4
empty 12
finance 11.8
retail 11.4
chrome 11.3
home 11.2
obstruction 11.1
shiny 11.1
money 11.1
push 10.4
handcart 10.3
trade 9.6
symbol 9.4
commercial 9.4
power 9.2
modern 9.1
design 9
steel 8.8
office 8.8
render 8.6
paper 8.6
architecture 7.8
case 7.8
wheeled vehicle 7.7
old 7.7
floor 7.4
close 7.4
style 7.4
stack 7.4
metallic 7.4
security 7.3
computer 7.3
building 7.1
information 7.1
interior 7.1
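
Tag lists in this form match Imagga's /v2/tags endpoint, which scores tags from 0 to 100. A minimal sketch; the credentials and image URL are placeholders:

# Sketch: Imagga auto-tagging via the v2 REST API.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/levitt.jpg"},  # placeholder
    auth=("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET"),  # placeholders
)
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")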

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

black and white 95.2
indoor 88.2
stairs 86
clothing 85.1
person 77.7
monochrome 75.8
text 71.5
street 57.6
footwear 52.1
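
Tags like these match Azure Computer Vision's Analyze endpoint with visualFeatures=Tags; the service returns confidences from 0 to 1, scaled here to match the listing. A minimal sketch; the resource endpoint, key, and image URL are placeholders:

# Sketch: Azure Computer Vision image tagging (REST, v3.2).
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},  # placeholder
    json={"url": "https://example.com/levitt.jpg"},  # placeholder
)
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")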

Color Analysis

Face analysis

AWS Rekognition

Age 45-53
Gender Female, 70.2%
Sad 65.2%
Confused 15.3%
Calm 12%
Fear 4.2%
Surprised 1%
Angry 0.8%
Disgusted 0.8%
Happy 0.7%
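
Face attributes in this form (age range, gender with confidence, ranked emotion scores) match Rekognition's DetectFaces call with all attributes requested. A minimal sketch, again with an illustrative file name:

# Sketch: Rekognition face analysis with full attributes.
import boto3

client = boto3.client("rekognition")
with open("levitt_nyc.jpg", "rb") as f:  # illustrative file name
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    gender = face["Gender"]
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")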

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
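
Google Cloud Vision reports face attributes as likelihood buckets ("Very unlikely" through "Very likely") rather than numeric scores. A minimal sketch using the google-cloud-vision client; the file name is illustrative:

# Sketch: Google Vision face detection with likelihood buckets.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("levitt_nyc.jpg", "rb") as f:  # illustrative file name
    image = vision.Image(content=f.read())
response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)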

Feature analysis

Amazon

Person 98.3%

Categories

Imagga

interior objects 98.1%
paintings art 1.4%

Captions

Microsoft
created on 2022-01-22

a person in a cage 64%
a person sitting in a cage 50.7%
a person sitting on a cage 44.4%
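
Ranked caption candidates with confidences match Azure Computer Vision's Describe endpoint, which can return several captions per image. A minimal sketch; the endpoint, key, and image URL are placeholders:

# Sketch: Azure Computer Vision captioning (REST, v3.2).
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": "3"},  # three candidates, as listed above
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},  # placeholder
    json={"url": "https://example.com/levitt.jpg"},  # placeholder
)
for caption in resp.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")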