Human Generated Data

Title

Untitled (family in living room)

Date

1951

People

Artist: John Howell, American, active 1930s-1960s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21546

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.5
Human 99.5
Person 98.9
Person 98.8
Furniture 98.5
Person 97.9
Person 97.8
Room 94.4
Indoors 94.4
Interior Design 94.2
Person 93.7
Person 93.1
Clinic 88.7
Person 86.7
Person 86.6
Couch 85.2
Chair 82.6
Living Room 81.1
People 79.4
Apparel 75.1
Clothing 75.1
Face 72.3
Photography 64.4
Portrait 64.4
Photo 64.4
Bedroom 61.8
Bed 61.5
Hospital 58.6
Chair 58.5
Table 58
Person 55.9
Workshop 55.4
Operating Theatre 55.2
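
The labels above are the kind of label/confidence output returned by Amazon Rekognition. Below is a minimal sketch of how such tags could be reproduced with boto3; the filename is a placeholder and AWS credentials are assumed to be configured.

```python
# Sketch: generate label tags like those above with Amazon Rekognition.
import boto3

rekognition = boto3.client("rekognition")

# Hypothetical local copy of the photograph.
with open("family_in_living_room.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # the list above bottoms out around 55%
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```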

Imagga
created on 2022-03-05

person 44.1
nurse 39.3
man 36.3
male 31.9
people 30.1
patient 28.5
adult 23.7
teacher 21.7
group 20.1
case 20
sick person 19.4
businessman 19.4
room 18.8
classroom 18.3
business 17.6
student 17.2
team 17
men 16.3
education 15.6
black 15
professional 14.4
happy 14.4
planner 14.3
school 14
sport 13
blackboard 12.5
portrait 11.6
lifestyle 11.6
holding 11.5
interior 11.5
hand 11.4
meeting 11.3
standing 11.3
human 11.2
casual 11
class 10.6
together 10.5
player 10.5
senior 10.3
mature 10.2
ball 10.2
smiling 10.1
active 10
smile 10
board 9.9
teaching 9.7
child 9.7
sitting 9.4
youth 9.4
two 9.3
businesswoman 9.1
world 9
health 9
office 8.9
to 8.8
home 8.8
colleagues 8.7
work 8.7
boy 8.7
30s 8.7
happiness 8.6
play 8.6
drawing 8.6
modern 8.4
teamwork 8.3
exercise 8.2
equipment 8.1
family 8
looking 8
job 8
women 7.9
hands 7.8
discussion 7.8
studying 7.7
athlete 7.5
cheerful 7.3
indoor 7.3
musical instrument 7.2
working 7.1
medical 7.1
hospital 7
indoors 7
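
The Imagga tags follow the tag/confidence format of Imagga's v2 tagging endpoint. A rough sketch is below; the API key, secret, and image URL are placeholders, and the response layout is assumed from Imagga's public REST documentation.

```python
# Sketch: request tag/confidence pairs from the Imagga v2 tagging API.
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_API_SECRET"  # placeholder
IMAGE_URL = "https://example.org/family_in_living_room.jpg"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

# Each entry carries an English tag name and a confidence score.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```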

Google
created on 2022-03-05

Style 83.8
Picture frame 80.7
Adaptation 79.3
Event 72.7
Vintage clothing 71.6
Monochrome 71.2
Art 70.8
Hat 69.8
Room 67.5
Table 67
Sitting 66.2
Monochrome photography 64.1
Shelf 64
T-shirt 63.5
History 61.4
Chair 61.1
Child 61
Classic 60.4
Photo caption 56.8
Font 53.9
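
The Google tags resemble the description/score pairs produced by Cloud Vision label detection. A minimal sketch, assuming the current google-cloud-vision client library, configured credentials, and a hypothetical local filename:

```python
# Sketch: label detection with the Google Cloud Vision client library.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("family_in_living_room.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Scores are 0-1; scale to match the percentages listed above.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")
```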

Microsoft
created on 2022-03-05

text 98.9
person 98.9
clothing 92.2
drawing 63.4
cartoon 61.7
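
The Microsoft tags match the shape of the Azure Computer Vision tag operation. A sketch follows, assuming the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and filename are placeholders.

```python
# Sketch: image tagging with the Azure Computer Vision SDK.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("YOUR_KEY"),                # placeholder key
)

with open("family_in_living_room.jpg", "rb") as f:
    result = client.tag_image_in_stream(f)

# Confidences are 0-1; scale to match the values listed above.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```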

Face analysis

Amazon

Google

AWS Rekognition

Age 30-40
Gender Male, 99.7%
Calm 87.2%
Sad 7.4%
Confused 3.1%
Happy 0.9%
Angry 0.5%
Disgusted 0.5%
Surprised 0.3%
Fear 0.2%

AWS Rekognition

Age 45-53
Gender Male, 94.9%
Calm 75.9%
Happy 11.2%
Sad 8.3%
Surprised 1.4%
Confused 1%
Disgusted 0.9%
Angry 0.9%
Fear 0.6%

AWS Rekognition

Age 40-48
Gender Male, 99.2%
Calm 68.3%
Happy 11.2%
Sad 8.4%
Confused 4.5%
Angry 2.6%
Surprised 2.3%
Disgusted 1.9%
Fear 0.8%

AWS Rekognition

Age 49-57
Gender Female, 86.7%
Happy 60.9%
Calm 32.4%
Disgusted 2.2%
Surprised 1.6%
Sad 0.9%
Fear 0.9%
Confused 0.6%
Angry 0.5%

AWS Rekognition

Age 49-57
Gender Male, 94.7%
Happy 62.5%
Surprised 34.4%
Calm 1.7%
Confused 0.9%
Sad 0.2%
Disgusted 0.2%
Fear 0.2%
Angry 0.1%

AWS Rekognition

Age 40-48
Gender Female, 59.7%
Sad 67.8%
Calm 18.3%
Happy 6.1%
Confused 3.2%
Surprised 2.4%
Fear 0.9%
Disgusted 0.8%
Angry 0.6%
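
Each block above (age range, gender, ranked emotions) mirrors one FaceDetail from Amazon Rekognition's DetectFaces call with all attributes requested. A minimal sketch, assuming configured AWS credentials and a hypothetical filename:

```python
# Sketch: per-face age, gender, and emotion scores from Amazon Rekognition.
import boto3

rekognition = boto3.client("rekognition")

with open("family_in_living_room.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required to get age, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back unordered; sort by confidence as in the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```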

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
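
The "Very unlikely" ratings above correspond to the likelihood enums that Cloud Vision face detection reports per face (surprise, anger, sorrow, joy, headwear, blur). A sketch under the same google-cloud-vision assumptions as the label-detection example:

```python
# Sketch: face likelihood ratings from Google Cloud Vision.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("family_in_living_room.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
likelihood = vision.Likelihood

for face in response.face_annotations:
    # Enum names such as VERY_UNLIKELY map to the ratings listed above.
    print("Surprise", likelihood(face.surprise_likelihood).name)
    print("Anger", likelihood(face.anger_likelihood).name)
    print("Sorrow", likelihood(face.sorrow_likelihood).name)
    print("Joy", likelihood(face.joy_likelihood).name)
    print("Headwear", likelihood(face.headwear_likelihood).name)
    print("Blurred", likelihood(face.blurred_likelihood).name)
```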

Feature analysis

Amazon

Person 99.5%
Chair 82.6%

Captions

Microsoft

a group of people posing for the camera 86.4%
a group of people standing in a room 86.3%
a group of people posing for a picture 86.2%
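
Ranked caption candidates like the three above are the kind of output returned by the Azure Computer Vision describe operation. A sketch under the same Azure SDK assumptions as before (placeholder endpoint, key, and filename):

```python
# Sketch: caption candidates with confidences from Azure Computer Vision.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder
    CognitiveServicesCredentials("YOUR_KEY"),                # placeholder
)

with open("family_in_living_room.jpg", "rb") as f:
    description = client.describe_image_in_stream(f, max_candidates=3)

for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```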

Text analysis

Amazon

ГОД

Google

YT37A2-XAGO
YT37A2-XAGO
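
The detected strings above are the sort of output returned by Amazon Rekognition DetectText and Cloud Vision text detection; Google's first annotation is the full detected text, followed by the individual elements, which is why the same string appears twice. A combined sketch under the same credential and filename assumptions as the earlier examples:

```python
# Sketch: text detection with Amazon Rekognition and Google Cloud Vision.
import boto3
from google.cloud import vision

with open("family_in_living_room.jpg", "rb") as f:
    image_bytes = f.read()

# Amazon Rekognition: report line-level detections.
rekognition = boto3.client("rekognition")
for detection in rekognition.detect_text(Image={"Bytes": image_bytes})["TextDetections"]:
    if detection["Type"] == "LINE":
        print("Amazon:", detection["DetectedText"])

# Google Cloud Vision: the first annotation is the full text block.
client = vision.ImageAnnotatorClient()
response = client.text_detection(image=vision.Image(content=image_bytes))
for annotation in response.text_annotations:
    print("Google:", annotation.description)
```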