Human Generated Data

Title

Untitled ("Beauty in the Making")

Date

c. 1931

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1917

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 97.7
Human 97.7
Clothing 97.2
Apparel 97.2
Person 97
Person 96.8
Person 94.9
Person 94.3
Person 93.8
Person 92.3
Person 90.1
Person 85.7
Person 74.1
Person 70.1
People 69.6
Person 69.6
Clinic 69.1
Person 63
Gown 61.1
Fashion 61.1
Robe 56.6
Wedding 55.8
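Label lists like the one above are name/confidence pairs of the kind returned by Amazon Rekognition's DetectLabels API. A minimal sketch of filtering such a response by confidence threshold; the response shape mirrors the documented DetectLabels output, and the client call itself is shown only as a comment since it requires AWS credentials:

```python
# Sketch: filter Rekognition-style labels by confidence.
# A real call would look like:
#   import boto3
#   client = boto3.client("rekognition")
#   response = client.detect_labels(Image={"Bytes": image_bytes})
# Here we use a small sample shaped like a DetectLabels response.

def filter_labels(response, min_confidence=90.0):
    """Return (name, confidence) pairs at or above min_confidence."""
    return [
        (label["Name"], label["Confidence"])
        for label in response["Labels"]
        if label["Confidence"] >= min_confidence
    ]

sample = {
    "Labels": [
        {"Name": "Person", "Confidence": 97.7},
        {"Name": "Clothing", "Confidence": 97.2},
        {"Name": "Gown", "Confidence": 61.1},
    ]
}

print(filter_labels(sample))  # keeps only Person and Clothing
```

Lower-confidence tags (such as "Gown 61.1" or "Wedding 55.8" above) are the ones a threshold like this would drop.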

Imagga
created on 2021-12-14

negative 56.3
film 50.6
photographic paper 37.6
groom 33
photographic equipment 25.1
people 21.7
person 20.2
wedding 17.5
table 15.7
man 15.4
work 14.9
medical 14.1
technology 14.1
bride 13.6
celebration 13.6
medicine 13.2
glass 12.9
human 12.7
working 12.4
flowers 12.2
bouquet 12.1
male 12.1
party 12
decoration 11.9
adult 11.9
ceremony 11.6
science 11.6
equipment 11.5
hand 11.4
couple 11.3
love 11
business 10.9
businessman 10.6
office 10.4
health 10.4
men 10.3
women 10.3
event 10.2
drink 10
holding 9.9
care 9.9
team 9.9
professional 9.8
test 9.6
doctor 9.4
hospital 9.4
rose 9.4
worker 9.1
wed 8.8
indoors 8.8
symbol 8.7
lab 8.7
laboratory 8.7
setting 8.7
day 8.6
happiness 8.6
marriage 8.5
wife 8.5
glasses 8.3
happy 8.1
dress 8.1
suit 8.1
romance 8
smiling 8
dinner 7.9
reception 7.8
napkin 7.8
black 7.8
education 7.8
knife 7.7
flower 7.7
husband 7.6
research 7.6
chair 7.6
meeting 7.5
room 7.4
group 7.2
television 7.2
home 7.2
romantic 7.1
portrait 7.1
together 7

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 99.6
window 93.1
person 91
clothing 90.3
wedding dress 86.4
woman 76.9
group 68.1
bride 62.5
dance 60.6
posing 57
dress 53.2
old 52.1

Face analysis

Amazon

AWS Rekognition

Age 29-45
Gender Female, 74.5%
Calm 73.4%
Happy 20.3%
Sad 2.1%
Surprised 1.4%
Confused 0.9%
Angry 0.9%
Disgusted 0.7%
Fear 0.3%

AWS Rekognition

Age 21-33
Gender Female, 66.4%
Happy 83.6%
Calm 6.2%
Sad 3.2%
Fear 2.6%
Angry 1.7%
Surprised 1.6%
Confused 0.8%
Disgusted 0.3%

AWS Rekognition

Age 23-37
Gender Male, 92.5%
Calm 84.6%
Sad 8.1%
Angry 2.5%
Happy 1.7%
Surprised 1.2%
Fear 1.1%
Confused 0.7%
Disgusted 0.2%

AWS Rekognition

Age 22-34
Gender Female, 53.2%
Happy 36.6%
Calm 35.3%
Sad 25%
Confused 1.1%
Fear 0.7%
Angry 0.5%
Surprised 0.5%
Disgusted 0.2%

AWS Rekognition

Age 23-37
Gender Male, 61.9%
Happy 87.8%
Calm 9.2%
Sad 1.9%
Confused 0.3%
Surprised 0.3%
Angry 0.2%
Fear 0.2%
Disgusted 0.2%

AWS Rekognition

Age 14-26
Gender Female, 68.9%
Calm 95.1%
Angry 1.7%
Happy 1.3%
Confused 1.2%
Sad 0.5%
Surprised 0.1%
Fear 0.1%
Disgusted 0.1%
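Each face block above reports an age range, a gender estimate, and a distribution over eight emotions, as Rekognition's DetectFaces returns when all attributes are requested. A sketch of reducing that distribution to the dominant emotion per face; the field names follow the documented DetectFaces response shape, and the sample values are taken from the first face listed above:

```python
# Sketch: pick the dominant emotion from a Rekognition-style face record.
# A real call would look like:
#   response = client.detect_faces(Image={"Bytes": image_bytes},
#                                  Attributes=["ALL"])
# and each face would appear under response["FaceDetails"].

def dominant_emotion(face):
    """Return the emotion type with the highest confidence."""
    return max(face["Emotions"], key=lambda e: e["Confidence"])["Type"]

sample_face = {
    "Emotions": [
        {"Type": "CALM", "Confidence": 73.4},
        {"Type": "HAPPY", "Confidence": 20.3},
        {"Type": "SAD", "Confidence": 2.1},
    ]
}

print(dominant_emotion(sample_face))  # CALM
```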

Feature analysis

Amazon

Person 97.7%

Captions

Microsoft

a group of people posing for a photo 80.8%
a group of people posing for a photo in front of a window 73.8%
a group of people posing for the camera 73.7%

Text analysis

Amazon

rue
rue Story
Story
()

Google

TueS or
TueS
or