Human Generated Data

Title

Untitled (Ben Shahn and Arthur Rothstein)

Date

c. 1936

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1998.138

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Art 97.5
Person 96.1
Human 96.1
Person 94.2
Drawing 90.6
Sketch 84.3
Painting 66.9
Person 64
Clothing 61.4
Apparel 61.4
Art Gallery 60.7
Canvas 58.6
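
The number after each tag is the service's confidence score, expressed as a percentage. Labels of this shape are consistent with the output of Amazon Rekognition's DetectLabels operation; the sketch below shows one way such tags could be produced with boto3. The file name and the confidence threshold are illustrative assumptions, not part of this record.

import boto3

# Hypothetical reproduction of the label pass; "photo.jpg" stands in for the
# actual image, which is not bundled with this text export.
rekognition = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # assumed threshold; the listed scores run down to ~58
    )
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')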

Clarifai
created on 2023-10-25

people 99.8
man 98.8
group 98.7
painting 97.6
adult 97.4
family 96.7
art 96.4
portrait 96.1
two 96
leader 94.4
three 92.5
woman 91.7
room 91.3
museum 90.3
actor 88.5
offspring 87.1
retro 86.3
music 85.1
wear 84.1
exhibition 82.3

Imagga
created on 2021-12-15

groom 47.7
people 19
dress 16.3
black 16.2
person 15.9
couple 14.8
window 14.4
man 14.1
art 13.8
old 13.2
sculpture 13
portrait 12.9
statue 12
celebration 12
happy 11.9
kin 11.8
love 11.8
adult 11.8
bride 11.8
elevator 11.5
happiness 11
face 10.6
room 10.4
luxury 10.3
women 10.3
decoration 10.2
light 10
male 10
religion 9.8
fashion 9.8
hair 9.5
smiling 9.4
architecture 9.4
future 9.3
lifting device 9.2
wedding 9.2
style 8.9
family 8.9
interior 8.8
sexy 8.8
screen 8.8
home 8.8
device 8.7
party 8.6
glass 8.6
elegance 8.4
color 8.3
indoor 8.2
office 8.1
clothing 8.1
group 8.1
posing 8
business 7.9
child 7.8
covering 7.8
model 7.8
two 7.6
bouquet 7.5
historical 7.5
house 7.5
dark 7.5
traditional 7.5
monument 7.5
building 7.4
historic 7.3
girls 7.3
body 7.2
holiday 7.2
smile 7.1
romantic 7.1

Google
created on 2021-12-15 (no tags listed)

Microsoft
created on 2021-12-15

drawing 97.1
text 95.9
clothing 94.6
person 87.7
gallery 77.9
sketch 64.3
painting 61.5
picture frame 12.9

Face analysis

AWS Rekognition

Age 35-51
Gender Male, 96.7%
Calm 43%
Sad 28.4%
Confused 7.8%
Angry 7.7%
Happy 6.7%
Surprised 2.7%
Fear 2.1%
Disgusted 1.5%

AWS Rekognition

Age 31-47
Gender Male, 92.4%
Calm 72.8%
Sad 20.6%
Happy 2.3%
Confused 1.6%
Angry 1.5%
Surprised 0.5%
Fear 0.3%
Disgusted 0.3%
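
The two face records above (an age range, a gender estimate, and an emotion breakdown that sums to roughly 100%) match the shape of the AWS Rekognition DetectFaces response when all facial attributes are requested. A minimal sketch, assuming boto3 and a local copy of the image; neither assumption comes from this record.

import boto3

rekognition = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:  # hypothetical file name
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required to get AgeRange, Gender, and Emotions
    )
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')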

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely
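
Unlike Rekognition, Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than numeric scores, which is why the two faces above are rated "Very unlikely", "Unlikely", or "Possible" for joy, sorrow, anger, surprise, headwear, and blur. A sketch of how those ratings are read with the google-cloud-vision client; the file name is an assumption.

from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each field is a Likelihood enum value: UNKNOWN, VERY_UNLIKELY, UNLIKELY,
    # POSSIBLE, LIKELY, or VERY_LIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)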

Feature analysis

Amazon

Person 96.1%

Captions

Microsoft
created on 2021-12-15

an old photo of a person 49.2%
an old photo of a person 49.1%
old photo of a person 44.6%

Text analysis

Amazon

10

Google

AU 10
AU
10
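
Both services found only a short fragment of text in the image: Amazon reads "10", while Google reads "AU 10" and then lists the individual tokens. Results of this kind come from the services' OCR endpoints; a hedged sketch, again assuming boto3 and google-cloud-vision with a local image file.

import boto3
from google.cloud import vision

with open("photo.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

# Amazon Rekognition text detection
rekognition = boto3.client("rekognition")
for det in rekognition.detect_text(Image={"Bytes": image_bytes})["TextDetections"]:
    print(det["Type"], det["DetectedText"])  # LINE- and WORD-level detections

# Google Cloud Vision text detection
client = vision.ImageAnnotatorClient()
response = client.text_detection(image=vision.Image(content=image_bytes))
for annotation in response.text_annotations:
    print(annotation.description)  # first entry is the full string, then per-word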