Human Generated Data

Title

Untitled (group portrait of ten member family sitting in living room)

Date

1939

People

Artist: O. B. Porter Studio, American, active 1930s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10999

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 99
Person 99
Person 98.9
Person 98.4
Sitting 98.3
Person 96.8
Person 96.6
Person 93.3
Furniture 92.6
Person 91.9
Indoors 88.1
Room 88.1
Crowd 87.8
Person 84.6
Couch 84.3
Apparel 79.5
Clothing 79.5
Overcoat 79
Coat 79
People 79
Waiting Room 74.8
Living Room 69.6
Suit 67.6
Press Conference 60.4
Reception 58.8
Audience 58.7
Reception Room 58.3
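
The Amazon tag list above has the shape of an AWS Rekognition `DetectLabels` response: each label carries a name and a confidence score. A minimal sketch of flattening such a response into the "Name Confidence" rows shown here; the sample payload below is illustrative and is not the actual API output for this image:

```python
# Sketch: turning a Rekognition DetectLabels-style response into
# "Name Confidence" rows like the list above. The payload is a
# hand-written sample; a real call would go through boto3's
# rekognition client ("detect_labels").
sample_response = {
    "Labels": [
        {"Name": "Human", "Confidence": 99.0},
        {"Name": "Person", "Confidence": 99.0},
        {"Name": "Sitting", "Confidence": 98.3},
        {"Name": "Living Room", "Confidence": 69.6},
        {"Name": "Audience", "Confidence": 58.7},
    ]
}

def extract_tags(response, min_confidence=55.0):
    """Return (name, confidence) pairs at or above the threshold, highest first."""
    pairs = [(label["Name"], round(label["Confidence"], 1))
             for label in response["Labels"]]
    return sorted((p for p in pairs if p[1] >= min_confidence),
                  key=lambda p: -p[1])

for name, conf in extract_tags(sample_response):
    print(f"{name} {conf}")
```

Raising `min_confidence` trims the speculative low-scoring tags (e.g. "Press Conference 60.4") that such services emit alongside the reliable ones.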

Imagga
created on 2022-01-23

businessman 53
business 48
office 45.1
executive 44.2
man 43
meeting 42.4
male 39
person 36
businesswoman 34.5
people 34
group 33.9
corporate 33.5
businesspeople 31.3
team 30.5
men 27.5
adult 26.6
suit 25.8
colleagues 25.3
teamwork 25
professional 24.8
laptop 23.8
work 23.6
together 22.8
sitting 22.3
communication 21.8
desk 21.7
happy 21.3
computer 20.9
manager 20.5
job 20.4
corporation 20.3
discussion 19.5
confident 19.1
indoors 18.5
indoor 18.3
table 18.2
working 17.7
couple 17.4
teacher 17.1
room 16.7
women 16.6
businessmen 16.6
cheerful 16.3
success 16.1
handsome 16
smiling 15.9
coworkers 15.7
conference 15.6
partnership 15.4
talking 15.2
looking 15.2
wind instrument 14.8
partners 14.6
boss 14.3
workplace 14.3
career 14.2
happiness 14.1
discussing 13.8
successful 13.7
smile 13.5
busy 13.5
30s 13.5
worker 13.5
planner 13.2
mature 13
entrepreneur 13
associates 12.8
staff 12.4
oboe 12.3
company 12.1
modern 11.9
occupation 11.9
20s 11.9
40s 11.7
groom 11.3
education 11.3
kin 10.9
musical instrument 10.8
employee 10.7
attractive 10.5
portrait 10.4
lifestyle 10.1
educator 10.1
brass 10
life 9.8
businesswomen 9.8
diverse 9.8
conversation 9.7
partner 9.7
planning 9.6
diversity 9.6
collar 9.6
adults 9.5
two 9.3
presentation 9.3
coffee 9.3
bassoon 8.9
color 8.9
classroom 8.9
30 35 years 8.8
colleague 8.8
cooperation 8.7
mid adult 8.7
jacket 8.7
four 8.6
formal 8.6
chair 8.5
females 8.5
senior 8.4
hall 8.2
board 8.1
interior 8
interaction 7.9
25 30 years 7.8
full length 7.8
employment 7.7
leader 7.7
leadership 7.7
casual 7.6
tie 7.6
friends 7.5
window 7.3
home 7.2
face 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

person 98.8
indoor 98.5
wall 96.7
clothing 96.4
suit 95.8
man 90.4
smile 79.8
room 79.4
people 65.7
group 56.5

Face analysis

AWS Rekognition

Age 37-45
Gender Male, 99.2%
Calm 77.8%
Confused 10.6%
Angry 3.2%
Sad 2.6%
Happy 2.1%
Surprised 1.7%
Fear 1.1%
Disgusted 1%

AWS Rekognition

Age 23-33
Gender Female, 100%
Calm 96.6%
Fear 0.7%
Sad 0.6%
Happy 0.6%
Surprised 0.6%
Confused 0.4%
Angry 0.4%
Disgusted 0.2%

AWS Rekognition

Age 62-72
Gender Female, 100%
Calm 97.5%
Happy 0.9%
Angry 0.5%
Confused 0.4%
Surprised 0.4%
Sad 0.1%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 26-36
Gender Male, 99.9%
Calm 29%
Happy 25.1%
Angry 13.2%
Fear 7.6%
Surprised 7.2%
Sad 6.7%
Disgusted 6.1%
Confused 5.1%

AWS Rekognition

Age 48-54
Gender Male, 99.5%
Calm 99.8%
Confused 0%
Angry 0%
Sad 0%
Happy 0%
Surprised 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 27-37
Gender Male, 99.9%
Calm 43.7%
Angry 37.6%
Confused 10.1%
Disgusted 2.2%
Surprised 1.8%
Fear 1.6%
Sad 1.6%
Happy 1.4%

AWS Rekognition

Age 57-65
Gender Male, 100%
Calm 99.8%
Angry 0.1%
Surprised 0%
Confused 0%
Disgusted 0%
Fear 0%
Happy 0%
Sad 0%

AWS Rekognition

Age 18-26
Gender Female, 100%
Happy 29.5%
Calm 19%
Surprised 17.4%
Angry 14.2%
Confused 9.1%
Fear 6.7%
Sad 3.3%
Disgusted 0.9%

AWS Rekognition

Age 4-10
Gender Female, 89.6%
Sad 94.5%
Angry 4%
Calm 0.7%
Confused 0.4%
Fear 0.2%
Surprised 0.1%
Happy 0.1%
Disgusted 0.1%

AWS Rekognition

Age 2-8
Gender Female, 100%
Calm 50.7%
Sad 26.3%
Angry 6.9%
Fear 4.8%
Happy 3.6%
Disgusted 3.3%
Surprised 2.2%
Confused 2.1%
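
Each AWS Rekognition block above reports an estimated age range, a gender prediction, and a confidence distribution over eight emotions. A sketch of reducing a `DetectFaces`-style face record to a one-line summary like those readings; the sample record mirrors the first face above but is illustrative, not the actual API output:

```python
# Sketch: summarizing one face from a Rekognition DetectFaces-style
# response (age range, gender, dominant emotion). The record is a
# hand-written sample modeled on the fields shown above.
sample_face = {
    "AgeRange": {"Low": 37, "High": 45},
    "Gender": {"Value": "Male", "Confidence": 99.2},
    "Emotions": [
        {"Type": "CALM", "Confidence": 77.8},
        {"Type": "CONFUSED", "Confidence": 10.6},
        {"Type": "ANGRY", "Confidence": 3.2},
    ],
}

def summarize_face(face):
    """Return 'Age LOW-HIGH, Gender, Emotion confidence%' for one face record."""
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    return (f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}, "
            f"{face['Gender']['Value']}, {top['Type'].title()} {top['Confidence']}%")

print(summarize_face(sample_face))
```

Note that the emotion confidences are a distribution, so a "dominant" emotion at 29% (as in one face above) is far weaker evidence than one at 99.8%.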

Microsoft Cognitive Services

Age 44
Gender Male

Microsoft Cognitive Services

Age 43
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%
Couch 84.3%
Suit 67.6%

Captions

Microsoft

a group of people sitting around a living room 97.3%
a group of people sitting in a room 97.2%
a group of people sitting in chairs in a room 97.1%

Text analysis

Amazon

VOEV
2VEE1X
VOEV 2VEE1X EITA
EITA

Google

VOEV 2VEEIA Eirn
VOEV
2VEEIA
Eirn
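
The strings above are raw OCR detections (both services misread whatever text appears in the photograph, so the garbled output is the data itself). A sketch of pulling line- and word-level strings out of a Rekognition `DetectText`-style response; the sample payload is illustrative, with strings copied from the Amazon results above:

```python
# Sketch: collecting detections from a Rekognition DetectText-style
# response. DetectText returns both LINE and WORD entries; the sample
# payload is hand-written, echoing the OCR strings listed above.
sample_response = {
    "TextDetections": [
        {"DetectedText": "VOEV 2VEE1X EITA", "Type": "LINE"},
        {"DetectedText": "VOEV", "Type": "WORD"},
        {"DetectedText": "2VEE1X", "Type": "WORD"},
        {"DetectedText": "EITA", "Type": "WORD"},
    ]
}

def detected_strings(response, kind="WORD"):
    """Return the detected strings of the given type ('LINE' or 'WORD')."""
    return [d["DetectedText"] for d in response["TextDetections"]
            if d["Type"] == kind]

print(detected_strings(sample_response))          # word-level detections
print(detected_strings(sample_response, "LINE"))  # full-line detection
```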