Human Generated Data

Title

Untitled (family portrait outdoors with trees)

Date

c.1950

People

Artist: Mary Lowber Tiers, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15725

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 99.7
Human 99.7
Person 99.7
Person 99.2
Apparel 92.2
Clothing 92.2
Face 78.3
Person 74.7
People 74.6
Food 65.8
Meal 65.8
Outdoors 65
Person 64.9
Tree 62.9
Plant 62.9
Furniture 60.5
Photo 60.1
Photography 60.1
Soil 59
Art 57.1
Military Uniform 55.9
Military 55.9
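
The Amazon labels above are confidence scores (0-100) returned by AWS Rekognition label detection. The following is a minimal sketch of how comparable labels could be generated with boto3; the local file name "photo.jpg", the thresholds, and the credential setup are assumptions, not the museum's actual pipeline.

```python
# Minimal sketch (assumes boto3 is configured with AWS credentials and
# "photo.jpg" is a local copy of the image).
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=25,        # cap the number of returned labels
        MinConfidence=55,    # drop labels below ~55% confidence
    )

# Print "Label confidence" pairs in the same form as the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```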

Imagga
created on 2022-02-05

newspaper 36.7
man 36.3
product 28.3
people 27.9
kin 26.2
male 25.5
couple 23.5
adult 23.5
groom 22.8
person 22.4
creation 22
happy 21.3
smiling 21
computer 20.9
sitting 20.6
home 18.3
business 18.2
laptop 18.2
women 18.2
smile 17.8
mother 17.7
men 16.3
businessman 15.9
together 15.8
office 15.4
casual 15.2
working 15
senior 15
portrait 14.9
indoors 14
two 13.5
attractive 13.3
lifestyle 13
work 12.6
happiness 12.5
bride 12.5
desk 12.3
mature 12.1
looking 12
love 11.8
color 11.1
indoor 10.9
businesswoman 10.9
worker 10.7
modern 10.5
group 10.5
notebook 10.5
talking 10.4
parent 10.3
outdoors 9.9
cheerful 9.7
job 9.7
technology 9.6
30s 9.6
elderly 9.6
day 9.4
wedding 9.2
pretty 9.1
family 8.9
grandma 8.8
colleagues 8.7
table 8.6
marriage 8.5
keyboard 8.4
black 8.4
communication 8.4
leisure 8.3
holding 8.2
handsome 8
clothing 7.9
look 7.9
mid adult 7.7
tree 7.7
old 7.7
loving 7.6
life 7.6
enjoying 7.6
room 7.5
meeting 7.5
friends 7.5
one 7.5
aged 7.2
team 7.2
professional 7.1
face 7.1
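
The Imagga tags follow the same tag-plus-confidence pattern. Below is a hedged sketch against Imagga's public tagging endpoint, assuming placeholder credentials and a hypothetical image URL; the exact response shape should be checked against Imagga's documentation.

```python
# Minimal sketch (assumes an Imagga API key/secret and a publicly reachable
# image URL; both values below are placeholders).
import requests

API_KEY = "your_api_key"
API_SECRET = "your_api_secret"
IMAGE_URL = "https://example.org/photo.jpg"  # hypothetical image location

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Each entry carries an English tag and a 0-100 confidence, as listed above.
for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```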

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 99
person 96.8
clothing 94.1
man 90.7

Face analysis

AWS Rekognition

Age 27-37
Gender Female, 72.7%
Calm 69.9%
Sad 19.2%
Fear 3.5%
Angry 2.3%
Confused 2%
Happy 1.6%
Disgusted 0.9%
Surprised 0.5%

AWS Rekognition

Age 50-58
Gender Female, 99.8%
Happy 86.2%
Calm 4.1%
Fear 3.8%
Surprised 3.2%
Confused 0.9%
Angry 0.7%
Sad 0.6%
Disgusted 0.5%

AWS Rekognition

Age 38-46
Gender Female, 58.4%
Calm 64.8%
Confused 11.8%
Surprised 10.9%
Sad 6.1%
Happy 3.6%
Angry 1.2%
Disgusted 1%
Fear 0.5%

AWS Rekognition

Age 29-39
Gender Female, 99.9%
Calm 99.9%
Sad 0%
Angry 0%
Happy 0%
Confused 0%
Surprised 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 26-36
Gender Male, 66.8%
Calm 99.9%
Sad 0%
Confused 0%
Surprised 0%
Happy 0%
Disgusted 0%
Angry 0%
Fear 0%
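
Each AWS Rekognition block above corresponds to one detected face, with an estimated age range, a gender guess, and emotion confidences. A minimal boto3 sketch that produces output in the same shape follows; the local file name and credential setup are assumptions.

```python
# Minimal sketch (assumes boto3 credentials and a local "photo.jpg"; it
# reproduces the kind of output shown above, not the exact pipeline).
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back unordered; sort by confidence to match the lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```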

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
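
The Google Vision blocks report per-face likelihood categories rather than percentages. A short sketch with the google-cloud-vision client is shown below, assuming the library is installed and authenticated and "photo.jpg" is a local copy of the image.

```python
# Minimal sketch (assumes google-cloud-vision is installed and authenticated).
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihoods are enums (VERY_UNLIKELY ... VERY_LIKELY), as rendered above.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```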

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a group of people sitting at a table 72.3%
a group of people sitting around a table 72.2%
a group of people posing for a photo 72.1%
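
The Microsoft captions are candidate descriptions with confidences from the Azure Computer Vision describe operation. A hedged sketch using the Azure SDK follows; the endpoint, key, and local file name are placeholders, and the method names should be verified against the current azure-cognitiveservices-vision-computervision documentation.

```python
# Minimal sketch (assumes an Azure Computer Vision resource; endpoint and key
# below are placeholders).
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
KEY = "your_subscription_key"                                      # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

with open("photo.jpg", "rb") as f:
    # Ask for several candidate captions, each with its own confidence.
    analysis = client.describe_image_in_stream(f, max_candidates=3)

for caption in analysis.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```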