Human Generated Data

Title

Untitled (Bertha and Harold)

Date

1910s

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.2911

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Clothing 100
Apparel 100
Coat 99.6
Overcoat 99.5
Person 97.6
Human 97.6
Person 95
Trench Coat 93.7
Shoe 93.3
Footwear 93.3
Boot 61.6

Clarifai
created on 2023-10-25

people 99.5
two 99.1
man 97.7
portrait 97.6
child 97.2
wear 97.1
love 96.6
woman 96.4
sepia 96.4
wedding 95.4
son 93.5
affection 92.5
retro 92.5
couple 92.3
adult 91.9
coat 90.9
girl 90.4
vintage 87.5
nostalgia 87.2
fall 86

Imagga
created on 2022-01-08

military uniform 35.2
uniform 32.7
clothing 30.5
adult 29.1
trench coat 29.1
man 29
coat 28.3
people 27.3
person 26.8
raincoat 25.2
male 23.4
portrait 23.3
helmet 19.5
attractive 18.2
fashion 18.1
walking 17
lifestyle 16.6
smile 16.4
casual 16.1
looking 16
happy 15.7
suit 15.5
street 14.7
business 14.6
women 13.4
city 13.3
pretty 13.3
holding 13.2
standing 12.2
jacket 12
armor plate 11.8
smiling 11.6
walk 11.4
urban 11.4
black 11.2
happiness 11
face 10.6
businessman 10.6
brunette 10.5
one 10.4
consumer goods 10.1
handsome 9.8
human 9.7
lady 9.7
fun 9.7
outdoors 9.7
autumn 9.7
expression 9.4
outdoor 9.2
active 9
family 8.9
covering 8.8
hair 8.7
couple 8.7
boy 8.7
fashionable 8.5
shirt 8.4
garment 8.3
alone 8.2
confident 8.2
group 8.1
sexy 8
posing 8
child 7.9
model 7.8
men 7.7
professional 7.7
youth 7.7
bag 7.5
life 7.4
20s 7.3
success 7.2
office 7.2
cute 7.2
cool 7.1
to 7.1
season 7
together 7
modern 7

Microsoft
created on 2022-01-08

person 99.8
clothing 98.4
outdoor 98.3
text 96.8
standing 94.5
coat 90.7
smile 85.3
posing 83.2
human face 72.7
man 65.9
retro 64.2

Face analysis

AWS Rekognition

Age 28-38
Gender Male, 100%
Calm 65.2%
Confused 23.2%
Angry 4.5%
Sad 3%
Happy 2.8%
Disgusted 0.5%
Surprised 0.4%
Fear 0.3%

AWS Rekognition

Age 21-29
Gender Female, 98.3%
Happy 99.6%
Angry 0.1%
Calm 0.1%
Surprised 0.1%
Confused 0.1%
Sad 0%
Disgusted 0%
Fear 0%

Microsoft Cognitive Services

Age 36
Gender Male

Microsoft Cognitive Services

Age 24
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Likely
Blurred Very unlikely

Feature analysis

Amazon

Coat 99.6%
Person 97.6%
Shoe 93.3%