Human Generated Data

Title

New York City (children)

Date

1940, printed later

People

Artist: Helen Levitt, American, 1913–2009

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.2666

Copyright

© Estate of Helen Levitt

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.5
Human 99.5
Person 99.4
Person 99.3
Person 97.4
Person 94.7
Clothing 94.6
Apparel 94.6
Face 81.9
Shorts 80.7
People 73.4
Performer 61.6
Wall 60.9
Art 60.3
Brick 58.8
Urban 56.1

Clarifai
created on 2023-10-26

people 100
group 99.6
child 99.6
man 98.1
adult 97.5
group together 97.3
boy 97
family 95
war 94.9
woman 93.5
many 93.5
several 91.7
documentary 91.4
monochrome 89.3
administration 87.4
soldier 86.9
home 85.9
recreation 85.8
portrait 85.6
veil 85.4

Imagga
created on 2022-01-22

man 21.5
old 19.5
silhouette 18.2
black 17.5
people 17.3
business 17
male 15.6
grunge 15.3
person 14.9
businessman 14.1
art 12.5
building 12.4
architecture 12.4
world 12.2
vintage 11.6
success 11.3
ancient 11.2
stone 11.2
group 10.5
office 9.9
fortress 9.6
graphic 9.5
work 9.4
newspaper 9.2
structure 9.1
statue 9
history 8.9
sky 8.9
men 8.6
travel 8.4
city 8.3
creation 8.2
symbol 8.1
shadow 8.1
clothing 8.1
product 7.9
corporate 7.7
pillory 7.7
wall 7.7
finance 7.6
manager 7.4
light 7.3
historic 7.3
alone 7.3
aged 7.2
religion 7.2
team 7.2
portrait 7.1
sculpture 7.1

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 98.1
clothing 97.6
person 97.2
gallery 93.6
man 89.8
outdoor 86.9
room 76.3
scene 74.4
woman 70
posing 65.3

Face analysis

AWS Rekognition

Age 16-24
Gender Female, 90.6%
Calm 49%
Fear 20.2%
Happy 11.4%
Surprised 9.4%
Confused 4.3%
Sad 2.8%
Angry 1.8%
Disgusted 1%

AWS Rekognition

Age 13-21
Gender Male, 51.5%
Calm 96.1%
Confused 2%
Disgusted 0.6%
Sad 0.6%
Angry 0.3%
Surprised 0.2%
Fear 0.1%
Happy 0.1%

AWS Rekognition

Age 18-24
Gender Male, 81%
Calm 98.5%
Fear 0.5%
Confused 0.2%
Angry 0.2%
Sad 0.1%
Happy 0.1%
Disgusted 0.1%
Surprised 0.1%

AWS Rekognition

Age 12-20
Gender Female, 56.8%
Fear 35.2%
Calm 26.6%
Surprised 16.2%
Confused 5.7%
Angry 5%
Happy 4.3%
Disgusted 3.9%
Sad 3.2%

Microsoft Cognitive Services

Age 8
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%

Categories