Human Generated Data

Title

Untitled (Ezra Shahn, New York City)

Date

1934

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4234

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 98.1
Human 98.1
Person 96.9
Person 96.5
Person 92.6
Person 91.3
People 83.6
Porch 68.2
Military 58
Clothing 55.2
Apparel 55.2

Clarifai
created on 2023-10-25

people 99.5
movie 99.4
negative 98.7
slide 98.2
vintage 98.2
filmstrip 97.7
group 97.6
wear 97.3
collage 97.3
retro 96.8
adult 96.8
picture frame 96.2
margin 95.4
old 95.2
two 94.7
art 93.8
woman 92.5
sepia 92
cinematography 91.2
man 90.8

Imagga
created on 2022-01-08

musical instrument 38.8
brass 37
architecture 35.1
wind instrument 31.9
history 31.3
sculpture 30.8
building 27.9
old 27.9
monument 26.1
statue 25.1
ancient 25.1
stone 24.5
historic 23.8
tourism 23.1
culture 23.1
travel 22.5
art 21.2
military uniform 20.2
landmark 19.9
religion 19.7
facade 18.8
historical 18.8
uniform 18.4
column 16.5
city 15.8
wall 15.5
window 15
famous 14.9
temple 13.9
church 13.9
marble 13.7
clothing 11.9
cornet 11.5
exterior 11.1
tourist 11
antique 10.6
percussion instrument 10.4
sky 10.2
town 10.2
balcony 9.7
detail 9.6
god 9.6
symbol 9.4
structure 9.4
palace 9.4
accordion 9.2
carving 9.2
device 9.1
memorial 8.8
house 8.8
marimba 8.7
arch 8.7
roof 8.6
keyboard instrument 8.4
covering 8.1
consumer goods 8.1
day 7.8
heritage 7.7
classical 7.6
capital 7.6
buildings 7.6
destination 7.5
people 7.2
decoration 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

clothing 97.1
person 96.7
indoor 94.2
text 87.2
man 84.5
old 70.2

Color Analysis

Face analysis

AWS Rekognition

Age 23-31
Gender Male, 97.8%
Calm 63.1%
Sad 25.2%
Fear 3.2%
Angry 2.2%
Confused 2.1%
Disgusted 2%
Happy 1.4%
Surprised 0.8%

AWS Rekognition

Age 37-45
Gender Male, 73.4%
Calm 68.8%
Sad 19.5%
Happy 8.2%
Confused 1.8%
Disgusted 0.6%
Angry 0.5%
Fear 0.3%
Surprised 0.3%

AWS Rekognition

Age 25-35
Gender Male, 99.7%
Calm 87.5%
Sad 4.6%
Happy 2.6%
Disgusted 1.8%
Fear 1.5%
Angry 1%
Surprised 0.6%
Confused 0.5%

AWS Rekognition

Age 1-7
Gender Female, 81.3%
Calm 68.5%
Confused 9.7%
Angry 6.2%
Happy 4.4%
Sad 4%
Surprised 3%
Disgusted 2.8%
Fear 1.4%

AWS Rekognition

Age 34-42
Gender Male, 99.5%
Calm 61.4%
Happy 30.6%
Confused 3%
Disgusted 2.1%
Sad 1.5%
Surprised 0.6%
Angry 0.5%
Fear 0.3%

AWS Rekognition

Age 35-43
Gender Male, 82.7%
Calm 33.2%
Disgusted 21.2%
Fear 15.3%
Angry 9.7%
Happy 7.3%
Sad 6.6%
Surprised 3.6%
Confused 3.2%

AWS Rekognition

Age 50-58
Gender Male, 69.3%
Sad 38.4%
Happy 35.6%
Confused 9%
Calm 8.7%
Disgusted 2.7%
Surprised 2.7%
Angry 1.6%
Fear 1.2%

AWS Rekognition

Age 19-27
Gender Female, 56.5%
Calm 97.4%
Confused 1.5%
Angry 0.5%
Sad 0.3%
Surprised 0.1%
Disgusted 0.1%
Fear 0.1%
Happy 0.1%

AWS Rekognition

Age 22-30
Gender Male, 67.4%
Fear 27.9%
Sad 27.1%
Calm 14.9%
Happy 12.4%
Confused 5.6%
Disgusted 4.9%
Surprised 3.6%
Angry 3.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.1%

Categories

Imagga

interior objects 99.8%

Captions

Microsoft
created on 2022-01-08

an old photo of a train 49.5%
old photo of a train 45.2%
an old photo of a train station 45.1%

Text analysis

Google

BEJ 3 0