Human Generated Data

Title

Untitled (family posed in front of foliage and candelabras)

Date

1941

People

Artist: Martin Schweig, American, 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9942

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Apparel 99.8
Clothing 99.8
Person 99.6
Human 99.6
Person 99.3
Person 98.9
Person 98.6
Person 98.5
Person 98.2
Person 96.8
Person 95.9
Dress 94
Person 93.1
Face 91.1
Overcoat 87.6
Suit 87.6
Coat 87.6
Female 87
People 86.3
Portrait 66.8
Photo 66.8
Photography 66.8
Woman 66.3
Girl 65.1
Icing 61.9
Dessert 61.9
Cake 61.9
Cream 61.9
Food 61.9
Creme 61.9
Robe 60.5
Fashion 60.5
Costume 60.1
Shorts 58.6
Jacket 58.4
Child 58.4
Kid 58.4
Tuxedo 57
Gown 56.8
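
The labels above are confidence-scored (0-100) output of the kind Amazon Rekognition's label detection returns. A minimal sketch of how such tags might be reproduced with boto3, assuming a local copy of the photograph and default AWS credentials; the file name and confidence threshold are illustrative, not part of this record:

import boto3

IMAGE_PATH = "photograph.jpg"  # hypothetical local image file

rekognition = boto3.client("rekognition")  # credentials/region come from the environment

with open(IMAGE_PATH, "rb") as f:
    response = rekognition.detect_labels(Image={"Bytes": f.read()}, MinConfidence=55)

# Print "Label Confidence" pairs in the same shape as the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")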

Imagga
created on 2022-01-29

nurse 29.4
kin 27.9
people 27.3
male 26.9
man 26.9
person 23.3
adult 22.1
businessman 18.5
professional 18
happy 17.5
men 17.2
medical 16.8
couple 16.5
business 16.4
old 16
worker 15.7
coat 15.7
happiness 14.9
group 14.5
clothing 14.5
lab coat 14.2
portrait 13.6
team 13.4
work 12.5
job 12.4
indoors 12.3
doctor 12.2
patient 12.2
senior 12.2
corporate 12
home 12
two 11.8
groom 11.6
bride 11.5
brass 11.4
looking 11.2
suit 10.8
negative 10.4
women 10.3
day 10.2
camera 10.2
smiling 10.1
smile 10
hospital 9.8
human 9.7
health 9.7
lab 9.7
medicine 9.7
elderly 9.6
love 9.5
clinic 9.4
mature 9.3
wedding 9.2
film 9.2
world 9
wind instrument 9
working 8.8
office 8.8
colleagues 8.7
chemistry 8.7
standing 8.7
chemical 8.7
laboratory 8.7
profession 8.6
biology 8.5
catholic 8.4
company 8.4
teamwork 8.3
emotion 8.3
metropolitan 8.2
businesswoman 8.2
life 8
family 8
musical instrument 7.9
gown 7.9
days 7.8
scientist 7.8
black 7.8
architecture 7.8
color 7.8
assistant 7.8
bouquet 7.7
daytime 7.7
instrument 7.7
serious 7.6
research 7.6
adults 7.6
care 7.4
occupation 7.3
20s 7.3
cheerful 7.3
new 7.3
aged 7.2
dress 7.2
building 7.2
face 7.1
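
Imagga's tagging output likewise pairs each tag with a confidence score. A rough sketch of querying Imagga's v2 tagging endpoint with the requests library, assuming hypothetical API credentials and a local image file; the response parsing assumes Imagga's result/tags JSON shape:

import requests

API_KEY = "your_api_key"        # hypothetical credential
API_SECRET = "your_api_secret"  # hypothetical credential
IMAGE_PATH = "photograph.jpg"   # hypothetical local image file

with open(IMAGE_PATH, "rb") as f:
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )

# Each entry carries an English tag name and a confidence score, as listed above.
for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")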

Google
created on 2022-01-29

Photograph 94.2
Dress 83.6
Adaptation 79.3
Suit 78.4
Art 77.9
Snapshot 74.3
Monochrome photography 71
Monochrome 70.9
Event 69.8
Font 68.6
Vintage clothing 67.3
Plant 67.1
Shorts 66.8
Classic 66.7
Uniform 65.7
Room 64.3
History 64.2
Stock photography 63.7
Illustration 62.2
Team 60.4
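
Google's labels come from Cloud Vision label detection, which scores each label between 0 and 1. A short sketch using the google-cloud-vision Python client, assuming a hypothetical local image file and application-default credentials:

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # reads GOOGLE_APPLICATION_CREDENTIALS

with open("photograph.jpg", "rb") as f:  # hypothetical local image file
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Scores are reported on a 0-1 scale; multiply by 100 to match the percentages above.
for annotation in response.label_annotations:
    print(f"{annotation.description} {annotation.score * 100:.1f}")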

Microsoft
created on 2022-01-29

person 99.2
clothing 96.8
text 95.6
outdoor 91.6
posing 87.3
man 84.3
group 67.2
wedding 55.3
old 43.2
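
Microsoft's tags correspond to Azure Computer Vision's tagging operation. A hedged sketch with the azure-cognitiveservices-vision-computervision client, assuming a hypothetical endpoint, key, and local image file:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Hypothetical endpoint and key; substitute your own Azure resource values.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<your-key>"),
)

with open("photograph.jpg", "rb") as f:  # hypothetical local image file
    result = client.tag_image_in_stream(f)

# Each tag carries a name and a 0-1 confidence; scale to match the list above.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")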

Face analysis

AWS Rekognition

Age 36-44
Gender Male, 100%
Calm 94.3%
Happy 5.1%
Sad 0.1%
Disgusted 0.1%
Surprised 0.1%
Confused 0.1%
Fear 0.1%
Angry 0.1%

AWS Rekognition

Age 30-40
Gender Male, 97.5%
Happy 63.9%
Calm 15.8%
Surprised 7.1%
Fear 5.9%
Sad 3.8%
Confused 1.7%
Disgusted 1.2%
Angry 0.6%

AWS Rekognition

Age 18-26
Gender Male, 99%
Happy 83.1%
Calm 6.8%
Surprised 3.5%
Fear 1.9%
Sad 1.8%
Disgusted 1.2%
Confused 1%
Angry 0.8%

AWS Rekognition

Age 20-28
Gender Male, 97.9%
Surprised 45.6%
Happy 25.5%
Calm 19.9%
Confused 3.1%
Sad 1.9%
Angry 1.7%
Fear 1.1%
Disgusted 1%

AWS Rekognition

Age 34-42
Gender Male, 99.9%
Calm 66%
Happy 24.7%
Sad 4.6%
Confused 1.6%
Surprised 1.3%
Angry 0.7%
Disgusted 0.6%
Fear 0.5%

AWS Rekognition

Age 24-34
Gender Male, 98.3%
Happy 99.5%
Surprised 0.2%
Fear 0.1%
Sad 0.1%
Disgusted 0%
Calm 0%
Angry 0%
Confused 0%

AWS Rekognition

Age 29-39
Gender Female, 62%
Surprised 64.7%
Happy 29.5%
Fear 3.1%
Calm 1.1%
Sad 0.8%
Angry 0.4%
Disgusted 0.3%
Confused 0.1%

AWS Rekognition

Age 23-33
Gender Male, 89%
Calm 95.7%
Sad 1.8%
Happy 0.9%
Fear 0.4%
Disgusted 0.3%
Confused 0.3%
Angry 0.3%
Surprised 0.2%

AWS Rekognition

Age 45-53
Gender Male, 99.9%
Happy 99.7%
Confused 0.1%
Surprised 0.1%
Calm 0.1%
Disgusted 0%
Sad 0%
Angry 0%
Fear 0%
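
Each block above is one face returned by Amazon Rekognition's face analysis: an estimated age range, a gender call with confidence, and a ranked set of emotion scores. A minimal sketch, assuming the same hypothetical local image file as before:

import boto3

rekognition = boto3.client("rekognition")

with open("photograph.jpg", "rb") as f:  # hypothetical local image file
    response = rekognition.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

# One block per detected face: age range, gender, and emotions, as listed above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")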

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very likely
Blurred Very unlikely
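
The Google Vision blocks report per-face likelihoods on a fixed scale (Very unlikely through Very likely) rather than numeric scores. A sketch of face detection with the google-cloud-vision client, again assuming a hypothetical local image file:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photograph.jpg", "rb") as f:  # hypothetical local image file
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihoods are enum values (VERY_UNLIKELY ... VERY_LIKELY), matching the wording above.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)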

Feature analysis

Amazon

Person 99.6%

Captions

Microsoft

a group of people posing for a photo 95.9%
a group of people posing for the camera 95.8%
a group of people posing for a picture 95.7%
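
The captions are the kind of ranked descriptions Azure Computer Vision's describe operation returns. A hedged sketch, with the same hypothetical endpoint, key, and image file as in the tagging example:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<your-key>"),
)

with open("photograph.jpg", "rb") as f:  # hypothetical local image file
    description = client.describe_image_in_stream(f, max_candidates=3)

# Ranked caption candidates with 0-1 confidences, as in the three lines above.
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}")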

Text analysis

Amazon

3
MJIF
MJIF YEER ARAA
YEER
ARAA

Google

A70A
MJI7 YT33A2 A70A
MJI7
YT33A2
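
The strings above are raw OCR output as returned by the two services. A sketch of how both text-detection calls might be made, assuming the same hypothetical local image file:

import boto3
from google.cloud import vision

with open("photograph.jpg", "rb") as f:  # hypothetical local image file
    image_bytes = f.read()

# Amazon Rekognition: detected lines and words, printed as raw strings.
rekognition = boto3.client("rekognition")
for detection in rekognition.detect_text(Image={"Bytes": image_bytes})["TextDetections"]:
    print(detection["DetectedText"])

# Google Cloud Vision: the first annotation is the full text block, the rest are words.
client = vision.ImageAnnotatorClient()
response = client.text_detection(image=vision.Image(content=image_bytes))
for annotation in response.text_annotations:
    print(annotation.description)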