Human Generated Data

Title

Untitled (family portrait on couch in living room)

Date

1965

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17619

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.6
Human 99.6
Person 99.5
Person 99.2
Person 98.4
Person 98.2
Person 97.7
Person 96.4
Person 96
Person 94.5
Sitting 89.2
Lamp 86.3
Table Lamp 86.3
Suit 78.1
Coat 78.1
Overcoat 78.1
Apparel 78.1
Clothing 78.1
People 70.2
Plant 68.1
Blossom 67.9
Flower 67.9
Furniture 66.6
Table 66.3
Person 63.4
Flower Arrangement 58.5
Photography 56.9
Portrait 56.9
Face 56.9
Photo 56.9
Indoors 56.1
Room 55.9
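The Amazon tag listing above pairs each label with a confidence score, matching the shape of an AWS Rekognition `DetectLabels` response. As a hedged sketch, the formatter below renders a Rekognition-style response dict into "Label Confidence" lines like those listed; the `sample` dict is illustrative, not the actual API output stored for this photograph.

```python
# Hypothetical sketch: format an AWS Rekognition DetectLabels-style response
# into "Label Confidence" lines like the Amazon listing above.
# The sample response is illustrative, not this photograph's stored analysis.

def format_labels(response):
    """Render labels as 'Name Confidence' lines, confidence rounded to one decimal."""
    lines = []
    for label in response.get("Labels", []):
        conf = round(label["Confidence"], 1)
        # Drop a trailing .0 to match the listing style (e.g. "Person 96")
        conf_str = str(int(conf)) if conf == int(conf) else str(conf)
        lines.append(f"{label['Name']} {conf_str}")
    return lines

sample = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.61},
        {"Name": "Sitting", "Confidence": 89.2},
        {"Name": "Lamp", "Confidence": 86.0},
    ]
}

for line in format_labels(sample):
    print(line)
```

A live call would pass the image bytes to `boto3.client("rekognition").detect_labels(...)` and feed its response through the same formatter.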

Imagga
created on 2022-02-26

man 40.3
male 34.8
person 34.8
people 33.5
adult 30.8
smiling 26
businessman 25.6
business 25.5
professional 24.4
happy 23.2
home 23.1
waiter 23
worker 22.9
meeting 22.6
room 21.7
men 21.5
team 20.6
cheerful 20.3
table 20
indoors 19.3
couple 19.2
businesspeople 19
employee 18.7
businesswoman 18.2
teacher 18
office 17.7
women 17.4
salon 16.9
group 16.9
mature 16.7
colleagues 16.5
20s 16.5
desk 16.1
work 15.9
smile 15.7
happiness 15.7
30s 15.4
dining-room attendant 15.3
talking 15.2
lifestyle 15.2
working 15
indoor 14.6
clinic 14.1
sitting 13.7
nurse 13.5
family 13.3
together 13.1
senior 13.1
teamwork 13
interior 12.4
portrait 12.3
life 12.2
corporate 12
casual 11.9
patient 11.8
executive 11.8
handsome 11.6
educator 11.6
computer 11.2
job 10.6
medical 10.6
modern 10.5
enjoying 10.4
two 10.2
camera 10.2
hospital 10
holding 9.9
associates 9.8
mother 9.8
to 9.7
cooperation 9.7
standing 9.6
board 9.4
communication 9.2
drink 9.2
laptop 9.1
human 9
kitchen 8.9
coworkers 8.8
discussing 8.8
chair 8.8
partner 8.7
four 8.6
face 8.5
document 8.5
doctor 8.5
attractive 8.4
clothing 8.4
horizontal 8.4
coat 8.4
color 8.3
confident 8.2
suit 8.1
restaurant 8
conference 7.8
40s 7.8
mid adult 7.7
adults 7.6
shop 7.6
fun 7.5
manager 7.5
meal 7.4
wine 7.4
classroom 7.3
new 7.3

Google
created on 2022-02-26

Photograph 94.3
Table 90.2
Coat 89.4
Chair 85.1
Black-and-white 84.7
Style 83.9
Suit 81.1
Monochrome photography 75.4
Vintage clothing 75.3
Window 75.2
Snapshot 74.3
Monochrome 73.8
Event 72.4
White-collar worker 70.4
Room 68.5
Classic 66
Team 64.7
History 64.6
Stock photography 64.1
Desk 62.7

Microsoft
created on 2022-02-26

person 99.4
clothing 93
text 91.5
indoor 91.3
window 90.6
man 81.6
group 79.9
old 73
people 66.6
posing 60.3
dining table 16.2

Face analysis

AWS Rekognition

Age 30-40
Gender Male, 78.7%
Calm 54.7%
Happy 19.8%
Surprised 17.8%
Fear 3.1%
Sad 1.5%
Disgusted 1.5%
Confused 1%
Angry 0.6%

AWS Rekognition

Age 34-42
Gender Male, 91.7%
Calm 47.4%
Happy 33%
Sad 8.1%
Surprised 3.8%
Disgusted 2.7%
Fear 2.5%
Angry 1.4%
Confused 1.1%

AWS Rekognition

Age 48-54
Gender Male, 87.6%
Happy 49.1%
Calm 30.6%
Surprised 14.2%
Fear 2.4%
Disgusted 1%
Confused 1%
Sad 0.9%
Angry 0.9%

AWS Rekognition

Age 49-57
Gender Male, 94.7%
Happy 69.9%
Surprised 18%
Sad 3.7%
Calm 2.9%
Confused 2.5%
Angry 1.4%
Disgusted 1.1%
Fear 0.4%

AWS Rekognition

Age 36-44
Gender Male, 98.1%
Calm 73.7%
Sad 11.2%
Happy 8.3%
Disgusted 1.9%
Angry 1.9%
Surprised 1.2%
Confused 1.1%
Fear 0.7%

AWS Rekognition

Age 49-57
Gender Male, 83.8%
Calm 99%
Happy 0.6%
Sad 0.2%
Confused 0.1%
Disgusted 0.1%
Surprised 0.1%
Angry 0.1%
Fear 0%

AWS Rekognition

Age 37-45
Gender Male, 94.4%
Calm 64.8%
Happy 18.2%
Angry 7.7%
Sad 5.6%
Confused 1.3%
Disgusted 1%
Surprised 0.7%
Fear 0.6%

AWS Rekognition

Age 35-43
Gender Female, 86%
Surprised 82%
Happy 4.9%
Calm 4.8%
Confused 2.4%
Fear 2.2%
Angry 1.4%
Sad 1.3%
Disgusted 1.1%

AWS Rekognition

Age 37-45
Gender Male, 99.8%
Sad 43.5%
Surprised 25.3%
Happy 16.3%
Calm 7.2%
Confused 2.6%
Disgusted 2%
Fear 1.6%
Angry 1.5%
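Each AWS Rekognition block above gives one detected face: an age range, a gender call with confidence, and the eight emotion scores sorted by descending confidence. As a hedged sketch, the helper below reduces a `DetectFaces`-style face record to that listing format; the `sample_face` record is illustrative, not the stored analysis for any face in this photograph.

```python
# Hypothetical sketch: reduce an AWS Rekognition DetectFaces-style face record
# to the per-face listing used above (age range, gender, emotions by confidence).
# The sample record is illustrative, not this photograph's stored analysis.

def summarize_face(face):
    """Return listing lines: age range, gender, then emotions sorted high to low."""
    lines = [
        f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}",
        f"Gender {face['Gender']['Value']}, {round(face['Gender']['Confidence'], 1)}%",
    ]
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        lines.append(f"{emotion['Type'].capitalize()} {round(emotion['Confidence'], 1)}%")
    return lines

sample_face = {
    "AgeRange": {"Low": 30, "High": 40},
    "Gender": {"Value": "Male", "Confidence": 78.72},
    "Emotions": [
        {"Type": "HAPPY", "Confidence": 19.8},
        {"Type": "CALM", "Confidence": 54.7},
        {"Type": "SURPRISED", "Confidence": 17.8},
    ],
}

for line in summarize_face(sample_face):
    print(line)
```

One such block would be produced per element of the `FaceDetails` list in a real `detect_faces` response.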

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%

Captions

Microsoft

a group of people posing for a photo 96.7%
a group of people posing for the camera 96.6%
a group of people posing for a picture 96.5%

Text analysis

Amazon

17
MJ17--YT37A°2
MJ17--YT37A°2 KOOK
KOOK

Google

17 MJI7--YT37A°2 XAaO
XAaO
17
MJI7--YT37A°2