Human Generated Data

Title

Untitled (family in living room)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17144

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.1
Human 99.1
Person 99
Apparel 98.9
Clothing 98.9
Person 97.4
Person 92.5
Shorts 85.6
Female 84.9
Art 78.7
Person 77.4
Person 70.3
Woman 70.2
Drawing 68.1
Person 67.9
Dress 66
Skirt 65.1
Portrait 63.2
Photo 63.2
Photography 63.2
Face 63.2
Coat 60.1
People 59.5
Sketch 55.4
Person 52.2
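The Amazon tags above are label names paired with confidence scores, as returned by a label-detection service such as Amazon Rekognition's `DetectLabels`. The sketch below shows, with an illustrative (not actual) response dictionary, how lines like "Person 99.1" could be rendered from that kind of output; the sample data and the threshold are assumptions, not the real API result for this photograph.

```python
# Minimal sketch: render Rekognition-style labels as "Name confidence"
# lines, one decimal place, highest confidence first. The `sample`
# response below is illustrative, not the actual output for this image.

def format_labels(response, min_confidence=50.0):
    """Return 'Name confidence' lines for labels at or above a threshold."""
    labels = [
        (lbl["Name"], round(lbl["Confidence"], 1))
        for lbl in response["Labels"]
        if lbl["Confidence"] >= min_confidence
    ]
    labels.sort(key=lambda pair: pair[1], reverse=True)
    # :g drops a trailing .0, matching entries like "Dress 66" above
    return [f"{name} {conf:g}" for name, conf in labels]

sample = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.12},
        {"Name": "Apparel", "Confidence": 98.91},
        {"Name": "Sketch", "Confidence": 55.44},
        {"Name": "Hat", "Confidence": 12.3},  # below threshold, dropped
    ]
}

for line in format_labels(sample):
    print(line)
```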

Imagga
created on 2022-02-26

businessman 39.8
business 37.1
people 35.7
man 32.9
office 31.5
male 30.6
corporate 29.2
professional 27.3
person 26.7
adult 26.3
meeting 25.5
outfit 25.2
work 23.6
group 22.6
team 22.4
happy 21.9
worker 20.5
businesswoman 20
women 19.8
teamwork 19.5
businesspeople 19
men 18.9
suit 17.5
groom 16.4
handsome 16.1
job 15.9
attractive 15.4
success 15.3
smile 15
successful 14.7
clothing 14.5
smiling 14.5
executive 14.2
laptop 13.7
portrait 13.6
ethnic 13.3
modern 13.3
desk 13.2
manager 13
pretty 12.6
talking 12.4
room 12.2
company 12.1
building 12
indoor 11.9
jacket 11.8
colleagues 11.7
bride 11.6
career 11.4
computer 11.3
boutique 11.1
communication 10.9
corporation 10.6
working 10.6
diversity 10.6
indoors 10.5
new 10.5
together 10.5
couple 10.5
looking 10.4
table 10.4
two 10.2
conference 9.8
businessmen 9.8
boss 9.6
standing 9.6
formal 9.6
happiness 9.4
finance 9.3
confident 9.1
black 9
human 9
associates 8.9
interior 8.8
day 8.6
workplace 8.6
adults 8.5
casual 8.5
color 8.3
hall 7.9
attire 7.8
diverse 7.8
hands 7.8
sitting 7.7
partnership 7.7
profession 7.7
hand 7.6
walking 7.6
fashion 7.5
presentation 7.4
employee 7.3
dress 7.2
lifestyle 7.2
home 7.2
medical 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 97.5
dress 95
clothing 92.8
person 86
woman 85.5
footwear 64.8
posing 54.4

Face analysis

Amazon

Google

AWS Rekognition

Age 51-59
Gender Male, 99.9%
Calm 76.9%
Sad 17.5%
Confused 3.3%
Angry 1.1%
Surprised 0.4%
Disgusted 0.3%
Happy 0.2%
Fear 0.2%

AWS Rekognition

Age 28-38
Gender Male, 99.5%
Calm 98.4%
Sad 0.4%
Angry 0.3%
Disgusted 0.3%
Confused 0.2%
Surprised 0.2%
Fear 0.1%
Happy 0.1%

AWS Rekognition

Age 38-46
Gender Female, 99.3%
Calm 78%
Sad 19.2%
Happy 1%
Confused 0.4%
Angry 0.4%
Surprised 0.4%
Disgusted 0.3%
Fear 0.1%
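Each AWS Rekognition block above reports an estimated age range, a gender with confidence, and emotion percentages sorted from most to least likely, the shape of a `DetectFaces` face detail. The sketch below shows how such lines could be rendered from that kind of structure; the sample face detail is illustrative, not the actual API output for this photograph.

```python
# Minimal sketch: render a Rekognition DetectFaces-style face detail as
# the "Age / Gender / emotion%" lines shown above. `sample_face` is an
# illustrative stand-in, not the real response for this image.

def format_face(detail):
    """Return age, gender, and emotion lines, emotions sorted by confidence."""
    lines = [
        f"Age {detail['AgeRange']['Low']}-{detail['AgeRange']['High']}",
        f"Gender {detail['Gender']['Value']}, {detail['Gender']['Confidence']:g}%",
    ]
    emotions = sorted(
        detail["Emotions"], key=lambda e: e["Confidence"], reverse=True
    )
    lines += [
        f"{e['Type'].capitalize()} {round(e['Confidence'], 1):g}%"
        for e in emotions
    ]
    return lines

sample_face = {
    "AgeRange": {"Low": 51, "High": 59},
    "Gender": {"Value": "Male", "Confidence": 99.9},
    "Emotions": [
        {"Type": "SAD", "Confidence": 17.5},
        {"Type": "CALM", "Confidence": 76.9},
    ],
}

for line in format_face(sample_face):
    print(line)
```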

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%

Captions

Microsoft

a group of people posing for a photo 80.2%
a group of people posing for the camera 80.1%
a group of people posing for a picture 80%

Text analysis

Amazon

21
TOA
KOOK
YE33AB KOOK
YE33AB

Google

YT3RA2-
TOA YT3RA2-
TOA