Human Generated Data

Title

Untitled (men, women, and babies in living room)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16529

Machine Generated Data

Tags

Amazon
created on 2022-02-12

Apparel 99.4
Clothing 99.4
Person 99.2
Human 99.2
Person 99
Person 97.9
Person 97.2
Person 96.8
Person 96.5
Person 94.6
Person 89.1
Clinic 88.4
Furniture 80.6
Coat 74.3
Overcoat 73.8
Suit 73.8
Female 67.6
Leisure Activities 65.3
People 65.3
Robe 59.5
Fashion 59.5
Gown 58.3
Chair 58.1
Lab Coat 57.6
Long Sleeve 57
Sleeve 57
Shorts 55
Person 48.7
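
The label/confidence pairs above are the kind of output returned by Amazon Rekognition's DetectLabels operation, where each label carries a 0-100 confidence score. A minimal boto3 sketch follows; the file name, region, and MinConfidence threshold are placeholders and assumptions, not values taken from this record.

    import boto3

    # Placeholder region and file name; adjust to the actual setup.
    client = boto3.client("rekognition", region_name="us-east-1")
    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    # DetectLabels returns label names with 0-100 confidence scores.
    response = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=45)
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')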

Imagga
created on 2022-02-12

person 42.7
man 41
people 33.5
male 31.9
businessman 31.8
business 28.5
adult 27.9
professional 24.3
team 24.2
nurse 23.5
office 23.3
kin 22.3
women 21.3
businesswoman 20
meeting 19.8
men 19.7
work 19.7
group 19.3
businesspeople 19
job 18.6
room 18.5
happy 17.5
worker 17
indoors 16.7
colleagues 16.5
corporate 16.3
desk 16
smiling 15.9
working 15.9
manager 15.8
teamwork 15.8
planner 15.6
portrait 14.9
patient 14.2
together 14
grandfather 13.4
communication 13.4
medical 13.2
health 13.2
lifestyle 13
cheerful 13
table 13
smile 12.8
suit 12.7
30s 12.5
talking 12.4
couple 12.2
executive 12.1
sitting 12
human 12
20s 11.9
discussion 11.7
interior 11.5
life 11.5
modern 11.2
mature 11.2
casual 11
laptop 10.9
associates 10.8
success 10.5
standing 10.4
computer 10.4
looking 10.4
happiness 10.2
two 10.2
hospital 10
mid adult 9.6
home 9.6
indoor 9.1
brass 9.1
confident 9.1
drawing 8.9
discussing 8.8
conference 8.8
day 8.6
color 8.3
holding 8.3
successful 8.2
wind instrument 8.1
handsome 8
medicine 7.9
bright 7.9
education 7.8
40s 7.8
clinic 7.8
partners 7.8
student 7.7
cooperation 7.7
teacher 7.7
partnership 7.7
four 7.7
adults 7.6
doctor 7.5
senior 7.5
silhouette 7.4
black 7.2
chair 7.1
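
These tags have the shape of Imagga's v2 tagging endpoint, which returns tag names with confidence scores. A rough sketch with the requests library is below; the credentials and file name are placeholders, and the response structure shown in the comment is assumed from Imagga's published v2 API rather than from this record.

    import requests

    # Placeholder credentials; Imagga's v2 API uses HTTP Basic auth (key, secret).
    auth = ("IMAGGA_API_KEY", "IMAGGA_API_SECRET")

    with open("photo.jpg", "rb") as f:  # placeholder file name
        resp = requests.post("https://api.imagga.com/v2/tags", auth=auth, files={"image": f})

    # Assumed response shape: {"result": {"tags": [{"confidence": ..., "tag": {"en": ...}}]}}
    for tag in resp.json()["result"]["tags"]:
        print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')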

Google
created on 2022-02-12

Black 89.5
Picture frame 88.5
Coat 88
Black-and-white 84.4
Style 83.8
Curtain 80.1
Vintage clothing 76.4
Monochrome photography 75.7
Monochrome 74.4
Art 71.7
Event 70.6
Classic 69.3
Chair 68.9
Suit 67.9
Window 67.4
Room 66
History 64.3
Stock photography 62.7
Sitting 60.1
Team 58.7
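
The Google tags correspond to label detection in the Cloud Vision API, where each label carries a 0-1 score (presumably scaled to a percentage on this page). A minimal sketch with the google-cloud-vision client, using a placeholder file name:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:  # placeholder file name
        image = vision.Image(content=f.read())

    # label_detection returns LabelAnnotation objects with a description and a 0-1 score.
    for label in client.label_detection(image=image).label_annotations:
        print(f"{label.description} {label.score * 100:.1f}")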

Microsoft
created on 2022-02-12

text 98
person 97.3
clothing 96.2
man 87.7
standing 81.6
posing 76.1
old 70.9
group 56.7
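
The Microsoft tags are the sort of output produced by the Azure Computer Vision tagging operation. A sketch with the azure-cognitiveservices-vision-computervision SDK follows; the endpoint, key, and file name are placeholders, and the percentage scaling is an assumption.

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder endpoint and key for the Computer Vision resource.
    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("<your-key>"),
    )

    with open("photo.jpg", "rb") as f:  # placeholder file name
        result = client.tag_image_in_stream(f)

    # Each tag carries a name and a 0-1 confidence.
    for tag in result.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")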

Face analysis

AWS Rekognition

Age 36-44
Gender Female, 69.1%
Sad 56.6%
Calm 38.4%
Happy 2.4%
Confused 1.9%
Angry 0.2%
Disgusted 0.2%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 31-41
Gender Male, 90.6%
Calm 90.4%
Happy 8.4%
Sad 0.8%
Fear 0.2%
Confused 0.1%
Disgusted 0.1%
Angry 0.1%
Surprised 0.1%

AWS Rekognition

Age 29-39
Gender Female, 54.1%
Happy 92.7%
Surprised 4.8%
Calm 1.1%
Angry 0.4%
Sad 0.3%
Confused 0.3%
Disgusted 0.2%
Fear 0.2%

AWS Rekognition

Age 42-50
Gender Male, 99.2%
Calm 78.2%
Confused 6.9%
Happy 4.9%
Sad 3.9%
Fear 3.2%
Surprised 1.5%
Angry 0.8%
Disgusted 0.6%

AWS Rekognition

Age 29-39
Gender Male, 57.8%
Happy 84.3%
Calm 6.6%
Fear 4.5%
Sad 3.3%
Surprised 0.6%
Angry 0.4%
Confused 0.2%
Disgusted 0.2%

AWS Rekognition

Age 25-35
Gender Female, 97.2%
Calm 82.4%
Surprised 6.4%
Fear 3.3%
Sad 2.8%
Disgusted 2.2%
Happy 1.3%
Angry 1%
Confused 0.5%

AWS Rekognition

Age 28-38
Gender Male, 97.5%
Happy 34.4%
Fear 19%
Calm 17.5%
Sad 12.5%
Surprised 11.6%
Angry 2%
Confused 1.6%
Disgusted 1.3%
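
Each AWS Rekognition block above (an age range, a gender estimate, and a ranked emotion distribution) is the kind of per-face output DetectFaces returns when all attributes are requested. A minimal boto3 sketch, with placeholder region and file name:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")  # placeholder region
    with open("photo.jpg", "rb") as f:
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]       # {"Low": ..., "High": ...}
        gender = face["Gender"]      # {"Value": ..., "Confidence": ...}
        emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in emotions:
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')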

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
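
The Google Vision blocks report likelihood buckets (Very unlikely through Very likely) rather than numeric scores; these correspond to the face_annotations returned by face detection. A sketch with the google-cloud-vision client, again with a placeholder file name:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:  # placeholder file name
        image = vision.Image(content=f.read())

    # Each FaceAnnotation exposes likelihood enums for several attributes.
    for face in client.face_detection(image=image).face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)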

Feature analysis

Amazon

Person 99.2%
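
The single Person score here plausibly reflects Rekognition's person detection, where a "Person" label can carry per-instance bounding boxes; that mapping is an assumption. A brief sketch of reading those instances from DetectLabels:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")  # placeholder region
    with open("photo.jpg", "rb") as f:
        response = client.detect_labels(Image={"Bytes": f.read()})

    # The "Person" label may include Instances with bounding boxes and per-instance confidence.
    for label in response["Labels"]:
        if label["Name"] == "Person":
            print(f'Person {label["Confidence"]:.1f}%')
            for instance in label["Instances"]:
                print("  box:", instance["BoundingBox"], f'{instance["Confidence"]:.1f}%')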

Captions

Microsoft

a group of people posing for a photo 93%
a group of people standing in front of a window 89.4%
an old photo of a group of people posing for the camera 89.3%
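
The ranked caption candidates match the output of Azure Computer Vision's describe operation, which can return several captions with 0-1 confidences. A sketch with the same SDK as the tagging example above; endpoint, key, and file name remain placeholders.

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder endpoint and key for the Computer Vision resource.
    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("<your-key>"),
    )

    with open("photo.jpg", "rb") as f:  # placeholder file name
        description = client.describe_image_in_stream(f)

    # Each caption candidate carries its own 0-1 confidence.
    for caption in description.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")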

Text analysis

Amazon

،

Google

-
YT37A°2
XAGON
MJI3-- YT37A°2 - - XAGON
MJI3--
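
The strings above are reported verbatim from each service's text detection on the photograph. For completeness, a minimal boto3 sketch of the Amazon text detection call, with placeholder region and file name:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")  # placeholder region
    with open("photo.jpg", "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})

    # TextDetections mixes LINE and WORD results; each carries the detected string.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"], f'{detection["Confidence"]:.1f}%')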