Human Generated Data

Title

Untitled (family portrait in living room)

Date

c. 1960

People

Artist: John Howell, American, active 1930s-1960s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21600

Machine Generated Data

Tags (label with confidence score, %)

Amazon
created on 2022-03-05

Apparel 99.4
Clothing 99.4
Human 99.3
Person 99.3
Person 99.3
Person 99
Person 98.7
Person 98.4
Person 97.8
Person 97.8
Person 97.5
Person 96.9
Coat 91.6
Overcoat 91.6
Suit 91.6
Person 84.2
People 82.8
Shoe 81.2
Footwear 81.2
Furniture 78.3
Chair 78.3
Clinic 78.1
Dress 68.9
Table 68.9
Female 67.8
Sleeve 66.1
Long Sleeve 66.1
Shoe 65
Robe 63.7
Evening Dress 63.7
Gown 63.7
Fashion 63.7
Tuxedo 62.4
Indoors 59.8
Room 58.7
Photography 57.1
Photo 57.1
Sitting 56
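
The Amazon list above pairs each detected label with a confidence score on a 0-100 scale, with repeated labels such as "Person" reflecting multiple detected instances. A minimal sketch of how such a list might be filtered down to high-confidence labels; the data is copied from a few entries above, no image-recognition call is made, and the threshold is an illustrative choice:

```python
# Hypothetical post-processing of the Amazon tag list above.
# Each entry is (label, confidence percent), copied from the record.
tags = [
    ("Apparel", 99.4), ("Clothing", 99.4), ("Human", 99.3),
    ("Person", 99.3), ("Coat", 91.6), ("Shoe", 81.2),
    ("Furniture", 78.3), ("Dress", 68.9), ("Sitting", 56.0),
]

def confident_labels(tags, threshold=90.0):
    """Keep labels at or above the confidence threshold, highest first."""
    kept = [t for t in tags if t[1] >= threshold]
    return sorted(kept, key=lambda t: t[1], reverse=True)

print(confident_labels(tags))
# [('Apparel', 99.4), ('Clothing', 99.4), ('Human', 99.3),
#  ('Person', 99.3), ('Coat', 91.6)]
```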

Imagga
created on 2022-03-05

person 53.2
man 41
teacher 40.3
professional 39.5
male 39
businessman 36.2
office 35.3
people 35.2
business 32.8
adult 31.2
meeting 30.2
group 29
educator 26
table 26
patient 24.9
room 24.8
executive 24.5
men 24.1
together 23.7
businesswoman 23.6
senior 23.4
sitting 22.3
laptop 22.2
indoors 22
corporate 21.5
teamwork 21.3
happy 21.3
job 21.2
team 20.6
worker 20.2
talking 20
desk 19.8
computer 18.5
home 18.4
nurse 18.3
smiling 18.1
entrepreneur 17.8
manager 17.7
working 17.7
colleagues 17.5
couple 17.4
women 17.4
work 17.3
businesspeople 17.1
indoor 16.4
case 15.6
sick person 14.9
mature 14.9
suit 14.4
elderly 14.4
coworkers 13.8
conference 13.7
education 13
success 12.9
student 12.9
successful 12.8
confident 12.7
communication 12.6
employee 12.5
classroom 12.4
career 12.3
portrait 12.3
lifestyle 12.3
smile 12.1
looking 12
modern 11.9
horizontal 11.7
mid adult 11.6
boss 11.5
cheerful 11.4
casual 11
handsome 10.7
retired 10.7
class 10.6
30s 10.6
kin 10.5
ethnic 10.5
happiness 10.2
hall 10.1
occupation 10.1
planner 10
board 10
associates 9.8
old 9.8
leader 9.6
diversity 9.6
retirement 9.6
four 9.6
workplace 9.5
presentation 9.3
chair 9.1
new 8.9
interior 8.9
diverse 8.8
teaching 8.8
document 8.6
face 8.5
hand 8.4
hospital 7.9
collaboration 7.9
businessmen 7.8
40s 7.8
discussion 7.8
color 7.8
corporation 7.7
busy 7.7
looking camera 7.7
studying 7.7
exam 7.7
finance 7.6
screen 7.5
study 7.5
inside 7.4
20s 7.3
grandfather 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

person 97.7
text 95.1
dress 92.9
clothing 89.3
woman 85.8
sport 79.6
wedding dress 62
posing 41.2

Face analysis

AWS Rekognition

Age 38-46
Gender Male, 99.9%
Surprised 95.1%
Calm 1.7%
Happy 1.3%
Confused 0.7%
Sad 0.4%
Fear 0.3%
Disgusted 0.3%
Angry 0.2%

AWS Rekognition

Age 45-51
Gender Female, 89.2%
Calm 62.7%
Happy 21.7%
Sad 11.2%
Confused 2.7%
Disgusted 0.5%
Fear 0.5%
Angry 0.4%
Surprised 0.3%

AWS Rekognition

Age 19-27
Gender Male, 96.5%
Happy 50.3%
Fear 21.3%
Calm 11.5%
Angry 7.3%
Sad 4.7%
Surprised 2.4%
Disgusted 1.5%
Confused 0.9%

AWS Rekognition

Age 45-53
Gender Male, 99.9%
Happy 62.5%
Sad 21.7%
Surprised 8.6%
Calm 2.9%
Confused 1.8%
Angry 1%
Disgusted 0.8%
Fear 0.7%

AWS Rekognition

Age 48-56
Gender Male, 74.3%
Sad 92.5%
Confused 2.8%
Happy 2%
Calm 1.1%
Disgusted 0.5%
Surprised 0.5%
Fear 0.3%
Angry 0.3%

AWS Rekognition

Age 48-54
Gender Male, 95.5%
Calm 59.8%
Sad 32.7%
Angry 2.4%
Surprised 2.1%
Fear 1.2%
Confused 0.8%
Disgusted 0.7%
Happy 0.2%

AWS Rekognition

Age 51-59
Gender Female, 55.3%
Happy 93.6%
Calm 5.8%
Sad 0.2%
Fear 0.1%
Confused 0.1%
Disgusted 0.1%
Surprised 0.1%
Angry 0.1%

AWS Rekognition

Age 38-46
Gender Male, 98.9%
Happy 25.2%
Sad 23.6%
Calm 18.1%
Disgusted 10.9%
Surprised 7.9%
Angry 7.8%
Confused 4.3%
Fear 2.1%

AWS Rekognition

Age 52-60
Gender Male, 84.8%
Happy 45.2%
Sad 21.7%
Surprised 8.4%
Disgusted 7.7%
Calm 6.6%
Angry 4.1%
Fear 3.8%
Confused 2.6%

AWS Rekognition

Age 33-41
Gender Male, 95.6%
Calm 54.1%
Sad 24%
Surprised 10.4%
Disgusted 3.4%
Confused 2.8%
Fear 1.9%
Angry 1.9%
Happy 1.4%
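
Each AWS Rekognition face block above reports an estimated age range, a gender estimate, and an emotion distribution whose percentages sum to roughly 100. A hedged sketch of how the dominant emotion could be read off one of these distributions; the values are copied from the first face block above, and this is plain post-processing rather than a Rekognition call:

```python
# Emotion distribution from the first AWS Rekognition face block above
# (emotion -> confidence percent).
face_emotions = {
    "Surprised": 95.1, "Calm": 1.7, "Happy": 1.3, "Confused": 0.7,
    "Sad": 0.4, "Fear": 0.3, "Disgusted": 0.3, "Angry": 0.2,
}

def dominant_emotion(emotions):
    """Return the (emotion, confidence) pair with the highest confidence."""
    return max(emotions.items(), key=lambda kv: kv[1])

print(dominant_emotion(face_emotions))  # ('Surprised', 95.1)
```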

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
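
Unlike Rekognition, Google Vision reports face attributes as likelihood buckets rather than numeric scores. A minimal sketch that maps those buckets onto an ordinal scale so the ten faces above can be compared; the bucket names follow the Vision API's likelihood categories, but the numeric ranks are an assumption for illustration only:

```python
# Assumed ordinal ranking for Google Vision likelihood buckets.
LIKELIHOOD_RANK = {
    "Very unlikely": 0, "Unlikely": 1, "Possible": 2,
    "Likely": 3, "Very likely": 4,
}

# Joy readings for the ten Google Vision faces above, in order.
joy = ["Unlikely", "Very unlikely", "Very unlikely", "Unlikely",
       "Very unlikely", "Very unlikely", "Very unlikely",
       "Very unlikely", "Very unlikely", "Very unlikely"]

# Count faces registering any joy above "Very unlikely".
some_joy = sum(1 for j in joy if LIKELIHOOD_RANK[j] > 0)
print(some_joy)  # 2
```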

Feature analysis

Amazon

Person 99.3%
Shoe 81.2%

Captions

Microsoft

a group of people posing for a photo 90.4%
a group of people posing for the camera 90.3%
a group of people sitting posing for the camera 90.1%