Human Generated Data

Title

Untitled (group portrait of ten member family sitting in living room)

Date

1939

People

Artist: O. B. Porter Studio, American, active 1930s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10997

Machine Generated Data

Tags (labels with confidence scores, %)

Amazon
created on 2022-01-23

Human 98.8
Sitting 98.8
Person 98.7
Person 97.5
Person 97.5
Person 96.4
Suit 95.4
Clothing 95.4
Overcoat 95.4
Apparel 95.4
Coat 95.4
Person 95.3
Crowd 92.5
Indoors 92.3
Audience 91.9
Room 91.8
Person 91
Person 89.2
Person 88.5
Person 83.1
Accessory 78
Tie 78
Accessories 78
Skin 74.7
Waiting Room 71.6
Suit 71.2
Furniture 71.2
People 71.1
Suit 70.9
Speech 69.1
Couch 59.8
School 59.7
Classroom 59.7
Press Conference 59.6
Reception 58
Reception Room 58
Court 57.2
Tie 50.9
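
The label-and-confidence pairs above are the kind of output returned by AWS Rekognition's DetectLabels operation. Below is a minimal sketch of how such tags could be generated, assuming the image is available as a local JPEG; the file name and the MinConfidence threshold are illustrative, not taken from the museum's actual pipeline.

```python
import boto3

IMAGE_PATH = "portrait.jpg"  # hypothetical file name

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

# DetectLabels returns label names with confidence scores (0-100),
# comparable to the "Human 98.8", "Sitting 98.8", ... pairs listed above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=50,
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```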

Imagga
created on 2022-01-23

brass 100
wind instrument 95.1
musical instrument 65.3
man 32.2
people 31.8
male 29.8
businessman 29.1
business 28.5
group 26.6
cornet 25.3
men 24.9
meeting 24.5
person 22.7
office 21.7
adult 21.3
room 19.9
team 19.7
table 19
indoors 18.4
businesswoman 18.2
work 16.5
communication 15.9
interior 15.9
together 15.8
corporate 15.4
sitting 15.4
suit 15.3
couple 14.8
professional 14.8
indoor 14.6
smiling 14.5
happy 14.4
businesspeople 14.2
chair 14.2
teamwork 13.9
education 13.8
executive 13.8
computer 13.6
home 13.5
women 13.4
modern 13.3
teacher 13
laptop 12.7
handsome 12.5
talking 12.3
desk 12.3
manager 12.1
presentation 12.1
classroom 12
happiness 11.7
conference 11.7
lifestyle 11.6
holding 11.5
board 10.8
trombone 10.8
discussion 10.7
colleagues 10.7
working 10.6
success 10.4
cheerful 9.7
class 9.6
friends 9.4
two 9.3
mature 9.3
smile 9.3
occupation 9.2
girls 9.1
school 9
family 8.9
job 8.8
looking 8.8
students 8.8
partner 8.7
oboe 8.6
enjoying 8.5
black 8.4
study 8.4
attractive 8.4
coffee 8.3
leisure 8.3
phone 8.3
confident 8.2
life 7.9
standing 7.8
portrait 7.8
restaurant 7.8
boys 7.7
youth 7.7
casual 7.6
student 7.5
baritone 7.5
successful 7.3
worker 7.2
device 7.1
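
The Imagga rows above pair an English tag with a confidence score, as returned by Imagga's v2 tagging endpoint. A minimal sketch follows, assuming the requests library, placeholder API credentials, and the same illustrative file name as above.

```python
import requests

# Placeholder credentials; substitute real Imagga API credentials.
API_KEY = "<your-api-key>"
API_SECRET = "<your-api-secret>"

with open("portrait.jpg", "rb") as f:  # hypothetical file name
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )

# Each entry pairs an English tag with a confidence score (0-100),
# comparable to the "brass 100", "wind instrument 95.1", ... pairs above.
for entry in response.json()["result"]["tags"]:
    print(entry["tag"]["en"], round(entry["confidence"], 1))
```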

Google
created on 2022-01-23

Suit 75
Chair 74.8
Event 73.2
Vintage clothing 72.4
Classic 72.1
Monochrome 67.5
Monochrome photography 66.1
Curtain 65.7
Room 65.6
Sitting 64.8
History 64.6
Stock photography 61.9
Team 59.4
Crew 56
Formal wear 52.5
Family 51.5
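
The Google rows above look like Cloud Vision label detection output; the API reports scores in the 0-1 range, shown here scaled to percent. A minimal sketch, assuming the google-cloud-vision client library and an illustrative file name:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("portrait.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

# label_annotations carry a description and a 0-1 score;
# multiplying by 100 gives values like "Suit 75" in the list above.
response = client.label_detection(image=image)

for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")
```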

Microsoft
created on 2022-01-23

clothing 96.6
wall 96.5
person 95.9
suit 93
indoor 90.8
man 89.7
smile 84.3
woman 80.3
group 76.4
posing 53.5
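
The Microsoft rows above resemble output from Azure Computer Vision's image tagging operation, which returns tag names with 0-1 confidences. A minimal sketch, assuming the azure-cognitiveservices-vision-computervision package and placeholder credentials:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder credentials; substitute a real endpoint and key.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
KEY = "<your-key>"

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

with open("portrait.jpg", "rb") as f:  # hypothetical file name
    result = client.tag_image_in_stream(f)

# Confidences are reported in 0-1; the list above shows them as percentages.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```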

Face analysis

AWS Rekognition

Age 42-50
Gender Male, 100%
Calm 99.7%
Confused 0.1%
Angry 0%
Surprised 0%
Sad 0%
Happy 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 64-74
Gender Female, 100%
Calm 99.8%
Angry 0.1%
Surprised 0%
Happy 0%
Confused 0%
Disgusted 0%
Sad 0%
Fear 0%

AWS Rekognition

Age 59-69
Gender Male, 100%
Calm 53.2%
Confused 28.5%
Surprised 6%
Angry 3.3%
Fear 3.1%
Disgusted 2.2%
Sad 2.2%
Happy 1.4%

AWS Rekognition

Age 30-40
Gender Male, 99.9%
Happy 46.7%
Calm 19.9%
Angry 11%
Surprised 7.8%
Confused 6.4%
Fear 3.1%
Disgusted 2.8%
Sad 2.3%

AWS Rekognition

Age 36-44
Gender Male, 99.6%
Calm 98.1%
Angry 0.8%
Confused 0.6%
Happy 0.2%
Disgusted 0.1%
Sad 0.1%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 29-39
Gender Female, 99.8%
Calm 53.3%
Angry 20.4%
Surprised 14.6%
Confused 5.7%
Happy 2.5%
Sad 1.6%
Fear 1.3%
Disgusted 0.6%

AWS Rekognition

Age 24-34
Gender Female, 100%
Calm 86.8%
Surprised 4.2%
Angry 2.1%
Sad 2.1%
Fear 1.4%
Happy 1.2%
Confused 1.2%
Disgusted 1%

AWS Rekognition

Age 35-43
Gender Male, 93.1%
Calm 97%
Angry 1.2%
Confused 0.6%
Surprised 0.3%
Sad 0.3%
Fear 0.2%
Happy 0.2%
Disgusted 0.1%

AWS Rekognition

Age 2-10
Gender Female, 98.7%
Calm 91.7%
Sad 4%
Confused 2.1%
Angry 0.7%
Fear 0.5%
Happy 0.3%
Surprised 0.3%
Disgusted 0.2%

AWS Rekognition

Age 2-10
Gender Female, 97.7%
Angry 83.8%
Calm 12.5%
Sad 1.6%
Surprised 0.5%
Disgusted 0.4%
Confused 0.4%
Happy 0.4%
Fear 0.3%
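
The per-face age ranges, gender estimates, and emotion scores above are typical of AWS Rekognition's DetectFaces output when all facial attributes are requested. A minimal sketch, again assuming an illustrative local file name rather than the museum's actual pipeline:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("portrait.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

# Attributes=["ALL"] adds age range, gender, and emotion estimates
# to each FaceDetail, matching the per-face blocks listed above.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.0f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```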

Microsoft Cognitive Services

Age 61
Gender Male

Microsoft Cognitive Services

Age 51
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
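
The Google Vision blocks above (Surprise, Anger, Sorrow, Joy, Headwear, Blurred) correspond to the face annotations returned by Cloud Vision face detection, which reports bucketed likelihoods rather than numeric scores. A minimal sketch, assuming the google-cloud-vision client library and an illustrative file name:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("portrait.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face annotation carries bucketed likelihoods (VERY_UNLIKELY ... VERY_LIKELY),
# which is why the rows above read "Very unlikely", "Unlikely", and so on.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```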

Feature analysis

Amazon

Person 98.7%
Suit 95.4%
Tie 78%

Captions

Microsoft

a group of people posing for a photo 97%
a group of people posing for the camera 96.9%
a group of people posing for a picture 96.8%
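
The ranked captions above are the kind of output produced by Azure Computer Vision's Describe Image operation, which proposes several candidate captions with confidence scores. A minimal sketch, assuming the azure-cognitiveservices-vision-computervision package, placeholder credentials, and an illustrative file name:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder credentials; substitute a real endpoint and key.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
KEY = "<your-key>"

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

with open("portrait.jpg", "rb") as f:  # hypothetical file name
    # Ask for several candidate captions, as in the three-caption list above.
    description = client.describe_image_in_stream(f, max_candidates=3)

for caption in description.captions:
    # The API reports confidence in 0-1; the listing above shows percentages.
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```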

Text analysis

Amazon

YT33A2
MJI7 YT33A2 A3DA
A3DA
MJI7
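
The strings above are text detections of the kind returned by AWS Rekognition's DetectText operation for any lettering it finds in the frame. A minimal sketch, again assuming an illustrative local file name:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("portrait.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

# DetectText returns both LINE and WORD detections, which is why the
# listing above contains a combined line ("MJI7 YT33A2 A3DA") as well as
# the individual words.
response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"], round(detection["Confidence"], 1))
```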