Human Generated Data

Title

Man Attaching Pearls

Date

1979-1983

People

Artist: Mary E. Frey, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.648

Copyright

© Mary E. Frey

Machine Generated Data

Tags

Amazon
created on 2021-04-03

Person 99.7
Human 99.7
Person 99.7
Table Lamp 98.7
Furniture 95.4
Lamp 94.4
Bed 77.2
Indoors 64.9
Bedroom 57.6
Room 57.6
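
Label-and-confidence listings like the Amazon block above are what AWS Rekognition's label detection returns. Below is a minimal sketch of how such a listing could be reproduced with boto3, assuming AWS credentials are already configured; the local file name is hypothetical.

```python
import boto3

rekognition = boto3.client("rekognition")

# Hypothetical local copy of the photograph.
with open("man_attaching_pearls.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=10,
        MinConfidence=50,
    )

# Rekognition reports confidence on a 0-100 scale, matching the scores above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```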

Clarifai
created on 2021-04-03

people 99.7
two 99.5
portrait 98.6
adult 98.4
man 98.2
woman 97.8
family 96.4
couple 95.4
monochrome 94.4
room 93.8
three 93.5
group 89.8
retro 87.3
love 87.2
indoors 82.8
facial expression 80.9
wear 80.9
nostalgia 80.7
elderly 80.7
girl 80.5

Imagga
created on 2021-04-03

nurse 41.7
man 37
person 33
male 32.2
people 31.8
adult 23.5
men 22.3
indoors 19.3
smiling 18.8
medical 18.5
happy 18.2
patient 17.9
home 17.5
professional 17.4
happiness 15.7
women 15
health 14.6
face 14.2
couple 13.9
black 13.8
sitting 13.7
portrait 13.6
human 13.5
family 13.3
one 12.7
doctor 12.2
occupation 11.9
room 11.6
lifestyle 11.6
holding 11.6
cheerful 11.4
love 11
hospital 11
child 10.8
care 10.7
hand 10.6
attractive 10.5
brunette 10.5
office 10.4
senior 10.3
dress 9.9
fashion 9.8
job 9.7
businessman 9.7
clothing 9.6
standing 9.6
husband 9.5
wife 9.5
casual 9.3
handsome 8.9
coat 8.8
work 8.7
mother 8.6
house 8.4
inside 8.3
alone 8.2
grandfather 8.1
looking 8
to 8
medicine 7.9
business 7.9
smile 7.8
mid adult 7.7
30s 7.7
parent 7.7
father 7.6
illness 7.6
case 7.6
leisure 7.5
groom 7.4
clinic 7.3
dad 7.3
worker 7.2
body 7.2
hair 7.1
working 7.1

Google
created on 2021-04-03

Microsoft
created on 2021-04-03

wall 99.5
person 94.7
text 93.4
man 90.4
clothing 89.2
smile 79.8
human face 74
black and white 67.2
posing 65.3
old 42
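
Tag scores like the Microsoft block above can be requested from the Azure Computer Vision "analyze" REST endpoint. The sketch below is an assumed reproduction; the endpoint host, subscription key, and image URL are placeholders.

```python
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "<subscription-key>"                                        # placeholder

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/man_attaching_pearls.jpg"},  # placeholder URL
    timeout=30,
)
response.raise_for_status()

# The service returns confidence on a 0-1 scale; the listing above shows percentages.
for tag in response.json()["tags"]:
    print(tag["name"], f"{tag['confidence'] * 100:.1f}")
```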

Color Analysis

Face analysis

AWS Rekognition

Age 44-62
Gender Female, 97.2%
Calm 95.4%
Sad 3.1%
Fear 0.3%
Disgusted 0.3%
Confused 0.3%
Angry 0.3%
Happy 0.1%
Surprised 0.1%

AWS Rekognition

Age 45-63
Gender Male, 90.7%
Sad 48%
Calm 47.7%
Confused 1.4%
Happy 0.7%
Fear 0.7%
Surprised 0.6%
Disgusted 0.6%
Angry 0.4%
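
The two face records above (estimated age range, gender, and ranked emotions) are the per-face output of Rekognition's face detection with full attributes. A minimal sketch, assuming the same hypothetical local file and configured credentials:

```python
import boto3

rekognition = boto3.client("rekognition")

with open("man_attaching_pearls.jpg", "rb") as f:  # hypothetical local copy
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]    # e.g. {'Low': 44, 'High': 62}
    gender = face["Gender"]   # e.g. {'Value': 'Female', 'Confidence': 97.2}
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back unsorted; sort by confidence to match the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```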

Microsoft Cognitive Services

Age 61
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
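
The Google Vision results above are likelihood buckets (very unlikely through very likely) rather than numeric confidences, reported once per detected face. A minimal sketch using the google-cloud-vision client, with the same hypothetical local file:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("man_attaching_pearls.jpg", "rb") as f:  # hypothetical local copy
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum, e.g. Likelihood.VERY_UNLIKELY.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```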

Feature analysis

Amazon

Person 99.7%
Lamp 94.4%

Categories

Imagga

people portraits 91.3%
paintings art 6.7%

Text analysis

Amazon

alter
a
y
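
The text fragments above come from Rekognition's OCR endpoint, which returns both full lines and individual words with confidences. A minimal sketch, again assuming a hypothetical local copy of the image:

```python
import boto3

rekognition = boto3.client("rekognition")

with open("man_attaching_pearls.jpg", "rb") as f:  # hypothetical local copy
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Keep only WORD detections so each fragment prints once.
for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}")
```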