Human Generated Data

Title

Untitled (group portrait of ten-member family sitting in living room)

Date

1939

People

Artist: O. B. Porter Studio, American, active 1930s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10920

Machine Generated Data

Tags (label, confidence in %)

Amazon
created on 2022-02-05

Person 98.9
Human 98.9
Person 98.8
Person 98.4
Person 98.1
Clothing 98
Apparel 98
Person 97.7
Furniture 96.7
Chair 96.7
Person 95.6
Accessory 94.8
Tie 94.8
Accessories 94.8
Indoors 89.7
Person 89.5
Face 86.2
Room 84.7
Person 83.3
Living Room 82.5
Suit 81.6
Coat 81.6
Overcoat 81.6
People 81.5
Table 75.9
Person 75.7
Couch 75.4
Person 73.3
Photography 72.6
Portrait 72.6
Photo 72.6
Flooring 71.9
Tie 70.6
Sitting 68.3
Dress 67.3
Stage 66.4
Floor 65.5
Female 63.1
Screen 61.5
Electronics 61.5
Display 61.1
Monitor 61.1
Wood 60.5
Kid 59.8
Child 59.8
Dining Table 59.8
Baby 57.8
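
The label list above is characteristic of the output of AWS Rekognition's DetectLabels operation. A minimal sketch of reproducing such a list with boto3, assuming configured AWS credentials and a hypothetical local copy of the photograph:

    import boto3

    rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

    # Hypothetical local file name for the print cataloged above.
    with open("porter_studio_family_portrait.jpg", "rb") as f:
        image_bytes = f.read()

    # DetectLabels returns label names with confidence scores on a 0-100
    # scale, matching rows such as "Person 98.9" in the list above.
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=50,
        MinConfidence=55,  # the list above bottoms out near 57.8
    )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')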

Imagga
created on 2022-02-05

person 33.5
room 32.2
people 30.7
man 25.5
adult 25.1
male 24.8
indoors 24.6
men 23.2
home 23.1
women 21.3
interior 21.2
nurse 19.2
professional 18.8
table 17.6
couple 16.5
teacher 16.4
happy 15.7
patient 15.3
life 15.2
business 15.2
smiling 14.5
lifestyle 14.4
modern 14
shop 13.7
sitting 13.7
worker 13.7
chair 13.5
family 13.3
happiness 13.3
smile 12.8
work 12.6
holding 12.4
barbershop 12.2
office 12.1
group 12.1
indoor 11.9
two 11.9
house 11.7
standing 11.3
hospital 10.9
businesswoman 10.9
classroom 10.7
businessman 10.6
talking 10.5
waiter 10.4
desk 10.4
clothing 10.2
inside 10.1
portrait 9.7
together 9.6
educator 9.5
love 9.5
senior 9.4
mature 9.3
20s 9.2
cheerful 8.9
two people 8.7
corporate 8.6
businesspeople 8.5
executive 8.5
meeting 8.5
casual 8.5
furniture 8
mercantile establishment 7.9
color 7.8
old 7.7
relaxation 7.5
window 7.5
style 7.4
phone 7.4
musical instrument 7.3
new 7.3
brass 7.3
kitchen 7.3
black 7.2
team 7.2
handsome 7.1
medical 7.1
restaurant 7
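
The Imagga rows follow the same label-plus-confidence pattern. A sketch of fetching comparable tags through Imagga's v2 tagging endpoint, assuming hypothetical API credentials and a hosted copy of the image:

    import requests

    # Hypothetical credentials and URL; Imagga's v2 API uses HTTP basic auth.
    API_KEY = "YOUR_IMAGGA_KEY"
    API_SECRET = "YOUR_IMAGGA_SECRET"
    IMAGE_URL = "https://example.org/porter_studio_family_portrait.jpg"

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
    )
    resp.raise_for_status()

    # Each entry pairs an English tag with a 0-100 confidence score,
    # matching rows such as "person 33.5" above.
    for item in resp.json()["result"]["tags"]:
        print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')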

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

wedding dress 98.3
bride 95.6
wall 95.2
person 87.6
woman 87.6
indoor 87.3
clothing 86.3
text 84.5
window 83.5
wedding 79.4
dress 77.6
man 69.6
black and white 61.7
table 53.1
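
The Microsoft tags resemble the output of Azure Computer Vision's image-tagging endpoint, which reports confidences in [0, 1]. A sketch against the v3.2 REST API, assuming a hypothetical Azure resource, key, and local file:

    import requests

    ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # hypothetical
    KEY = "YOUR_AZURE_KEY"  # hypothetical

    with open("porter_studio_family_portrait.jpg", "rb") as f:  # hypothetical file
        image_bytes = f.read()

    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/tag",
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=image_bytes,
    )
    resp.raise_for_status()

    # Scale to 0-100 to match rows such as "wedding dress 98.3" above.
    for tag in resp.json()["tags"]:
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')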

Face analysis

AWS Rekognition

Age 43-51
Gender Male, 86%
Calm 74%
Happy 21.1%
Sad 1.6%
Surprised 1.3%
Confused 0.7%
Disgusted 0.6%
Angry 0.4%
Fear 0.3%

AWS Rekognition

Age 29-39
Gender Male, 99.7%
Calm 95.6%
Fear 1.2%
Surprised 1.2%
Sad 0.6%
Happy 0.6%
Confused 0.4%
Disgusted 0.2%
Angry 0.2%

AWS Rekognition

Age 29-39
Gender Male, 97.4%
Calm 83.7%
Sad 11.8%
Fear 1.5%
Angry 1%
Happy 0.7%
Confused 0.6%
Disgusted 0.5%
Surprised 0.3%

AWS Rekognition

Age 42-50
Gender Male, 100%
Calm 93.8%
Surprised 2.6%
Confused 1.6%
Sad 1%
Disgusted 0.6%
Happy 0.2%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 43-51
Gender Male, 76.4%
Calm 75.3%
Sad 11%
Surprised 3.8%
Angry 3.7%
Happy 2.7%
Confused 1.5%
Disgusted 1.3%
Fear 0.6%

AWS Rekognition

Age 26-36
Gender Female, 53.8%
Sad 94.1%
Calm 4.1%
Disgusted 0.6%
Surprised 0.4%
Fear 0.3%
Confused 0.2%
Angry 0.2%
Happy 0.1%

AWS Rekognition

Age 22-30
Gender Male, 86.8%
Calm 87.4%
Sad 7.9%
Fear 1.8%
Disgusted 1%
Angry 0.9%
Happy 0.5%
Confused 0.4%
Surprised 0.2%

AWS Rekognition

Age 18-24
Gender Male, 68.2%
Sad 97.2%
Calm 1.1%
Confused 0.4%
Angry 0.3%
Happy 0.3%
Fear 0.3%
Disgusted 0.2%
Surprised 0.1%
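
The eight blocks above match the FaceDetails structure returned by AWS Rekognition's DetectFaces operation, one block per detected face. A sketch with boto3, assuming configured credentials and the same hypothetical local file:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("porter_studio_family_portrait.jpg", "rb") as f:  # hypothetical file
        image_bytes = f.read()

    # Attributes=["ALL"] adds age range, gender, and emotion estimates
    # to the default bounding-box output.
    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Emotions arrive unsorted; sort to mirror the descending lists above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')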

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
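
Unlike Rekognition, Google Cloud Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the blocks above read "Very unlikely". A sketch with the google-cloud-vision client, assuming application-default credentials and a hypothetical local file:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()  # assumes GOOGLE_APPLICATION_CREDENTIALS is set

    with open("porter_studio_family_portrait.jpg", "rb") as f:  # hypothetical file
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # One annotation per detected face; each attribute is a Likelihood enum.
    for face in response.face_annotations:
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)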

Feature analysis

Amazon

Person 98.9%
Tie 94.8%

Captions

Microsoft

a group of people standing in front of a window 82.6%
a group of people standing in front of a mirror 78%
a group of people standing in front of a mirror posing for the camera 67.6%
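
Ranked caption candidates like these are what Azure Computer Vision's describe endpoint returns. A sketch against the v3.2 REST API, assuming the same hypothetical Azure resource and key as above:

    import requests

    ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # hypothetical
    KEY = "YOUR_AZURE_KEY"  # hypothetical

    with open("porter_studio_family_portrait.jpg", "rb") as f:  # hypothetical file
        image_bytes = f.read()

    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/describe",
        params={"maxCandidates": 3},  # three candidates, as listed above
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=image_bytes,
    )
    resp.raise_for_status()

    for caption in resp.json()["description"]["captions"]:
        print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')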

Text analysis

Amazon

YI33A2
M-11 YI33A2 АЗВА
АЗВА
S
M-11
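
Fragmentary strings like these are typical OCR output for studio prints, where negative numbering and edge markings dominate. A sketch of comparable output via AWS Rekognition's DetectText operation, assuming configured credentials and the same hypothetical local file:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("porter_studio_family_portrait.jpg", "rb") as f:  # hypothetical file
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    # LINE detections give strings such as "M-11 YI33A2"; WORD entries
    # repeat the same text token by token, so keep only the lines.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])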