Human Generated Data

Title

Untitled (group portrait of ten member family sitting in living room)

Date

1939

People

Artist: O. B. Porter Studio, American, active 1930s–1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10998

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Clothing 99.7
Coat 99.7
Overcoat 99.7
Apparel 99.7
Person 98.5
Human 98.5
Tuxedo 98.1
Sitting 97.4
Furniture 96.5
Person 96.3
Chair 95.4
Person 94.9
Person 94.2
Person 92.5
Person 91.1
Person 89.9
Couch 88.4
Crowd 84.9
Person 83.2
Indoors 75.4
Tie 73.8
Accessory 73.8
Accessories 73.8
Waiting Room 73.6
Room 73.6
People 72.1
Reception 71.1
Reception Room 71.1
Plant 65.7
Suit 65.2
Suit 60.4
Audience 59.4
Press Conference 58.2
Dress 56.8
Person 46.2

Imagga
created on 2022-01-23

business 43.1
man 42.3
businessman 41.5
male 36.2
people 35.1
meeting 33.9
office 33.7
person 31.7
kin 30.8
group 30.6
wind instrument 30.1
businesswoman 29.1
adult 28.4
men 28.3
brass 27.3
corporate 26.6
team 26
businesspeople 23.7
executive 23.6
professional 22.9
suit 21.7
computer 21.7
musical instrument 21.4
teamwork 21.3
work 20.4
colleagues 19.4
laptop 19.3
couple 19.2
happy 18.8
communication 18.5
indoors 18.4
sitting 18
corporation 17.4
together 16.6
women 16.6
desk 16.1
mature 15.8
room 15.6
teacher 15.4
manager 14.9
happiness 14.9
businessmen 14.6
handsome 14.3
working 14.1
lifestyle 13.7
indoor 13.7
conference 13.7
job 13.3
table 13
oboe 12.9
looking 12.8
discussion 12.7
employee 12.6
cheerful 12.2
education 12.1
success 12.1
confident 11.8
coworkers 11.8
discussing 11.8
partners 11.7
smiling 11.6
busy 11.6
workplace 11.4
talking 11.4
modern 11.2
20s 11
classroom 10.8
businesswomen 10.8
worker 10.7
planner 10.7
smile 10.7
family 10.7
jacket 10.6
attractive 10.5
associates 9.8
diverse 9.8
partner 9.7
diversity 9.6
partnership 9.6
boss 9.6
chair 9.5
life 9.5
tie 9.5
career 9.5
day 9.4
senior 9.4
presentation 9.3
educator 8.9
40s 8.8
employment 8.7
cooperation 8.7
love 8.7
groom 8.7
planning 8.7
30s 8.7
four 8.6
staff 8.6
formal 8.6
females 8.5
casual 8.5
bassoon 8.4
horizontal 8.4
color 8.3
occupation 8.2
successful 8.2
board 8.1
interior 8
portrait 7.8
mid adult 7.7
married 7.7
college 7.6
adults 7.6
holding 7.4
technology 7.4
coffee 7.4
home 7.2
building 7.1
face 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

person 98.6
clothing 97.5
suit 97.1
indoor 92.1
man 91.1
smile 85.6
group 84.2
people 73.8

Face analysis

AWS Rekognition

Age 35-43
Gender Male, 99.8%
Calm 99.9%
Angry 0%
Confused 0%
Sad 0%
Happy 0%
Disgusted 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 19-27
Gender Female, 99.4%
Calm 72.4%
Fear 12.5%
Confused 6.6%
Surprised 5.7%
Angry 1%
Sad 0.8%
Happy 0.7%
Disgusted 0.4%

AWS Rekognition

Age 37-45
Gender Male, 100%
Confused 53.8%
Calm 38.3%
Angry 6.2%
Sad 0.7%
Surprised 0.4%
Fear 0.3%
Disgusted 0.2%
Happy 0.2%

AWS Rekognition

Age 47-53
Gender Male, 97.3%
Calm 99.1%
Sad 0.2%
Surprised 0.2%
Confused 0.1%
Angry 0.1%
Happy 0.1%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 25-35
Gender Male, 99.9%
Calm 27.7%
Angry 19.6%
Surprised 16.6%
Happy 9.3%
Confused 9.1%
Disgusted 7.3%
Sad 6.2%
Fear 4.1%

AWS Rekognition

Age 26-36
Gender Female, 99.5%
Disgusted 45.3%
Calm 35.3%
Surprised 12.4%
Confused 4%
Fear 1.4%
Sad 0.6%
Happy 0.6%
Angry 0.4%

AWS Rekognition

Age 2-8
Gender Female, 77%
Angry 37.8%
Calm 35.2%
Sad 13%
Confused 6.3%
Surprised 3.9%
Disgusted 1.9%
Happy 1%
Fear 0.9%

AWS Rekognition

Age 0-4
Gender Female, 99.9%
Calm 72.5%
Sad 16.9%
Angry 3.3%
Happy 2%
Fear 1.8%
Confused 1.6%
Disgusted 1%
Surprised 0.9%

AWS Rekognition

Age 56-64
Gender Male, 100%
Calm 99.9%
Confused 0%
Sad 0%
Surprised 0%
Happy 0%
Angry 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 74-84
Gender Female, 100%
Calm 99.8%
Angry 0%
Confused 0%
Sad 0%
Surprised 0%
Happy 0%
Disgusted 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.5%
Tie 73.8%
Suit 65.2%

Captions

Microsoft

a group of people sitting in a room 98.4%
a group of people sitting posing for the camera 96.8%
a group of people sitting and standing in a room 96.7%

Text analysis

Amazon

YT37A2
MJIR YT37A2 ARDA
MJIR
ARDA

Google

2VEEJA
VOLV 2VEEJA EIr
EIr
VOLV