Human Generated Data

Title

New York City (children with mirror frame)

Date

1940, printed later

People

Artist: Helen Levitt, American, 1913-2009

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.2663

Copyright

© Estate of Helen Levitt

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.7
Human 99.7
Person 99.4
Person 99.2
Person 99.2
Person 99.1
Person 98.8
Person 98.6
Person 97.3
Person 97.3
Person 97.2
Bicycle 92.6
Transportation 92.6
Vehicle 92.6
Bike 92.6
Person 92.3
Person 91
Person 80.1
Person 78.1
Urban 76.2
Person 74.9
Poster 67.6
Advertisement 67.6
Clothing 66.9
Apparel 66.9
Person 64.7
Person 63.7
Building 62.5
Person 58.4
Face 56.9
Text 56.1
Kid 56
Child 56
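
A label list like the one above matches the shape of Amazon Rekognition's DetectLabels output. A minimal sketch of how such tags could be generated with boto3, assuming configured AWS credentials; the image file name is hypothetical:

```python
# Sketch: label tagging with Amazon Rekognition DetectLabels (boto3).
# AWS credentials are assumed to be configured; the file name is hypothetical.
import boto3

client = boto3.client("rekognition")

with open("levitt_new_york_city.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,  # the list above bottoms out around 56
)

# Print "Label confidence" pairs in the same format as the tag list
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```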

Clarifai
created on 2023-10-26

people 100
monochrome 99
child 98.8
street 98.8
adult 98.5
group 98
group together 98
woman 97.6
man 96.6
many 92.7
war 90.9
boy 90.3
administration 87.9
two 87.4
wear 85.9
documentary 85.5
three 81.4
merchant 80.5
newspaper 79.7
vehicle 79.1
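
The Clarifai tags follow the concept/confidence shape of Clarifai's prediction API. A minimal sketch against the v2 REST endpoint; the API key, model ID, and image URL below are assumptions, not values from this record:

```python
# Sketch: concept tagging via Clarifai's v2 REST API
# (the model ID is an assumption about the general recognition model).
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"  # placeholder
URL = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"

payload = {
    "inputs": [
        {"data": {"image": {"url": "https://example.com/levitt.jpg"}}}  # hypothetical URL
    ]
}
resp = requests.post(URL, json=payload, headers={"Authorization": f"Key {API_KEY}"})
resp.raise_for_status()

# Concepts carry a 0-1 value; scale to 0-100 to match the list above
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```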

Imagga
created on 2022-01-22

shop 48.6
mercantile establishment 34.8
shoe shop 26.5
place of business 23.2
toyshop 21.8
old 20.9
city 19.1
art 16.3
man 16.2
building 14.8
ancient 14.7
vintage 14.1
architecture 14
history 13.4
people 13.4
antique 13
religion 12.5
black 12
stall 12
newspaper 12
statue 11.6
establishment 11.6
dirty 10.8
person 10.8
male 10.7
urban 10.5
product 10.3
wall 10.3
historic 10.1
dress 9.9
soldier 9.8
clothing 9.6
window 9.5
decoration 9.5
culture 9.4
religious 9.4
war 8.7
men 8.6
grunge 8.5
texture 8.3
creation 8.3
aged 8.1
landmark 8.1
new 8.1
mask 8
design 7.9
artistic 7.8
sculpture 7.8
portrait 7.8
historical 7.5
traditional 7.5
retro 7.4
street 7.4
uniform 7.3
paint 7.2
detail 7.2
color 7.2
adult 7.2
holiday 7.2
travel 7
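
The Imagga tags match the output of Imagga's /v2/tags endpoint. A minimal sketch using HTTP basic auth; the key, secret, and image URL are placeholders:

```python
# Sketch: auto-tagging via Imagga's /v2/tags endpoint with basic auth.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/levitt.jpg"},  # hypothetical URL
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # placeholders
)
resp.raise_for_status()

# Imagga reports tag confidence on a 0-100 scale, as listed above
for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```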

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

clothing 99.5
text 99
person 91.4
man 91
black and white 74.4
footwear 67.2
newspaper 56.8
room 47.6
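
The Microsoft tags are consistent with Azure Computer Vision's Analyze operation with the Tags visual feature. A minimal sketch against the v3.2 REST API; the endpoint, key, and image URL are placeholders:

```python
# Sketch: tagging via Azure Computer Vision's v3.2 Analyze endpoint.
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"  # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.com/levitt.jpg"},  # hypothetical URL
)
resp.raise_for_status()

# Azure reports tag confidence on a 0-1 scale; scale to match the list above
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```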

Face analysis

AWS Rekognition

Age 4-12
Gender Female, 99.5%
Sad 95.6%
Calm 2.2%
Confused 0.9%
Angry 0.6%
Disgusted 0.3%
Surprised 0.2%
Fear 0.2%
Happy 0.1%

AWS Rekognition

Age 23-33
Gender Female, 54.3%
Happy 66%
Fear 25.1%
Surprised 3.2%
Angry 2.8%
Sad 0.9%
Confused 0.8%
Calm 0.6%
Disgusted 0.6%

AWS Rekognition

Age 9-17
Gender Female, 59.4%
Calm 57.4%
Sad 39%
Angry 1.2%
Disgusted 0.8%
Surprised 0.8%
Happy 0.4%
Fear 0.3%
Confused 0.1%

AWS Rekognition

Age 54-62
Gender Female, 95.7%
Sad 97.6%
Calm 0.9%
Fear 0.5%
Confused 0.3%
Angry 0.3%
Disgusted 0.2%
Happy 0.1%
Surprised 0.1%

AWS Rekognition

Age 4-12
Gender Female, 98.2%
Fear 63.2%
Sad 23%
Calm 7.3%
Surprised 4.3%
Disgusted 0.8%
Confused 0.7%
Angry 0.4%
Happy 0.3%

AWS Rekognition

Age 1-7
Gender Female, 62.3%
Calm 54.6%
Angry 34.6%
Sad 4.4%
Confused 4.2%
Disgusted 1.1%
Surprised 0.5%
Happy 0.4%
Fear 0.2%

AWS Rekognition

Age 0-6
Gender Male, 100%
Sad 83.3%
Calm 16.4%
Confused 0.1%
Angry 0.1%
Happy 0%
Fear 0%
Disgusted 0%
Surprised 0%

AWS Rekognition

Age 4-12
Gender Male, 88.5%
Fear 31.9%
Calm 27.9%
Sad 18.9%
Angry 11.8%
Confused 2.9%
Happy 2.4%
Surprised 2.4%
Disgusted 1.7%

AWS Rekognition

Age 20-28
Gender Male, 97.6%
Angry 77.9%
Calm 12.6%
Disgusted 4.1%
Surprised 2.3%
Confused 1.1%
Fear 1.1%
Sad 0.6%
Happy 0.2%

AWS Rekognition

Age 19-27
Gender Male, 91.3%
Sad 90.4%
Calm 7.9%
Happy 0.5%
Fear 0.5%
Confused 0.2%
Angry 0.2%
Disgusted 0.1%
Surprised 0.1%

AWS Rekognition

Age 29-39
Gender Female, 97.4%
Calm 81.5%
Surprised 13.9%
Angry 1.5%
Sad 1.2%
Disgusted 0.7%
Happy 0.5%
Fear 0.4%
Confused 0.3%
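
Each AWS Rekognition block above (an age range, a gender estimate, and a ranked emotion distribution per detected face) has the shape of a DetectFaces response with all facial attributes requested. A minimal sketch, assuming configured AWS credentials and a hypothetical file name:

```python
# Sketch: per-face age/gender/emotion estimates via Rekognition DetectFaces.
import boto3

client = boto3.client("rekognition")

with open("levitt_new_york_city.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # needed for AgeRange, Gender, and Emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions arrive unsorted; sort descending to match the blocks above
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
    print()
```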

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
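
The Google Vision blocks above report Cloud Vision's per-face likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A minimal sketch, assuming Google Cloud credentials are configured; the file name is hypothetical:

```python
# Sketch: face-attribute likelihoods via Google Cloud Vision face_detection.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("levitt_new_york_city.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum; format it like the blocks above
for face in response.face_annotations:
    for label, value in [
        ("Surprise", face.surprise_likelihood),
        ("Anger", face.anger_likelihood),
        ("Sorrow", face.sorrow_likelihood),
        ("Joy", face.joy_likelihood),
        ("Headwear", face.headwear_likelihood),
        ("Blurred", face.blurred_likelihood),
    ]:
        print(label, value.name.replace("_", " ").capitalize())
    print()
```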

Feature analysis

Amazon

Person 99.7%
Bicycle 92.6%

Categories

Imagga

paintings art 98.6%
people portraits 1.2%
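
The Categories entries ("paintings art", "people portraits") look like output from an Imagga categorizer such as personal_photos. A minimal sketch against the /v2/categories endpoint; the categorizer ID, credentials, image URL, and response shape are assumptions based on Imagga's v2 conventions:

```python
# Sketch: categorization via Imagga's /v2/categories endpoint
# (categorizer ID "personal_photos" is an assumption).
import requests

resp = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",
    params={"image_url": "https://example.com/levitt.jpg"},  # hypothetical URL
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # placeholders
)
resp.raise_for_status()

for category in resp.json()["result"]["categories"]:
    print(f'{category["name"]["en"]} {category["confidence"]:.1f}%')
```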

Text analysis

Amazon

ICE
LAUNDRY
HAND
QUAY
WALTER QUAY
WALTER
225
Cola
COA
7up
KEROSENE
Coca-Cola
225 W3-1ST
REPAIBING
-
LABIEST
W3-1ST
OF
Gente
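
The Amazon token list matches Rekognition's DetectText output at WORD granularity; misreadings such as "REPAIBING" and "LABIEST" are the raw OCR result from the photographed signage, not transcription errors in this record. A minimal sketch, assuming configured AWS credentials and a hypothetical file name:

```python
# Sketch: scene-text extraction via Rekognition DetectText.
import boto3

client = boto3.client("rekognition")

with open("levitt_new_york_city.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# WORD detections correspond to the individual tokens listed above
for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        print(detection["DetectedText"])
```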

Google

225 EST WALTER QUAY HAND EPAIRINO LAUNDRY ERIGEN ICE COAL Fela
EST
QUAY
HAND
EPAIRINO
ERIGEN
COAL
Fela
225
WALTER
LAUNDRY
ICE
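
The Google block has the shape of Cloud Vision's text_detection output, where the first annotation is the full detected text and the remaining annotations are individual words. A minimal sketch under the same Google Cloud assumptions as above:

```python
# Sketch: scene-text extraction via Google Cloud Vision text_detection.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("levitt_new_york_city.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation aggregates the full text; the rest are word tokens
for annotation in response.text_annotations:
    print(annotation.description)
```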