Human Generated Data

Title

Ham Slicer

Date

1949

People

Artist: George Heyer, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Dr. Daniel Bell, P1979.54

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.7
Human 99.7
Chef 99.3
Person 98.4
Person 77.1
Person 72.3
Finger 67.2

Clarifai
created on 2023-10-25

people 99.9
monochrome 99.5
man 98.3
adult 96.8
portrait 95.8
documentary 94.5
child 93.3
two 92.5
group 92.2
woman 88.9
street 86.8
sit 83.1
group together 82.7
three 81.9
lid 81
wear 79.4
furniture 76.2
uniform 75.9
boy 74.8
family 74.2

Imagga
created on 2021-12-14

television 41.8
telecommunication system 32.5
man 27.5
people 25.1
male 24.9
person 24.1
adult 20
black 18.7
portrait 17.5
love 15
business 14.6
men 13.7
lifestyle 13.7
couple 13.1
face 12.8
businessman 12.4
office 11.4
looking 11.2
sitting 11.2
casual 11
world 10.8
suit 10.8
happy 10.6
attractive 10.5
one 10.4
home 10.4
fire 10.3
indoor 10
leisure 10
silhouette 9.9
handsome 9.8
human 9.7
happiness 9.4
youth 9.4
alone 9.1
fireplace 8.9
smile 8.5
tie 8.5
studio 8.4
20s 8.2
computer 8.2
style 8.2
light 8.1
bride 8
day 7.8
25 30 years 7.8
groom 7.7
book 7.7
corporate 7.7
problem 7.7
outdoor 7.6
hand 7.6
adults 7.6
relaxation 7.5
drink 7.5
clothing 7.5
manager 7.4
smiling 7.2
music 7.2
art 7.2
romantic 7.1
women 7.1
working 7.1
indoors 7

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

clothing 98.9
text 98.6
man 94.9
monitor 93.5
television 92.7
person 85.6
human face 83.5
screen 80
hat 76.9
black and white 52.7
dish 40.3
picture frame 9.2

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 25-39
Gender Male, 53.4%
Calm 90.1%
Sad 5.9%
Surprised 2.1%
Confused 0.7%
Fear 0.5%
Happy 0.3%
Angry 0.3%
Disgusted 0.1%

AWS Rekognition

Age 37-55
Gender Female, 81.3%
Calm 95.9%
Sad 1.4%
Happy 1.3%
Angry 0.5%
Confused 0.4%
Surprised 0.4%
Fear 0.2%
Disgusted 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%

Categories

Imagga

paintings art 88.7%
food drinks 6.1%
interior objects 4.1%