Human Generated Data

Title

Untitled (two men and two women posed in suits in studio)

Date

c. 1945

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1818

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Clothing 99.5
Apparel 99.5
Person 98.9
Human 98.9
Person 98.7
Person 97.9
Dress 96.4
Person 95.2
Female 89.5
Sleeve 87.6
Long Sleeve 79.5
Woman 70.7
Flower 65.1
Blossom 65.1
Plant 65.1
Girl 62.7
Coat 62
Home Decor 59.4
Photography 59.1
Photo 59.1
Portrait 59.1
Face 59.1
Suit 58.6
Overcoat 58.6
Bag 57.4
Shirt 55
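The Amazon tag list above pairs each label with a confidence score out of 100. A common way to use such output is to keep only labels above a confidence threshold. The sketch below is illustrative Python, not the museum's actual pipeline; the `(label, score)` pairs are copied from the list above, truncated for brevity.

```python
# Illustrative sketch: filter machine-generated labels by confidence.
# The (label, score) pairs are copied from the Amazon tag list above.
amazon_tags = [
    ("Clothing", 99.5), ("Apparel", 99.5), ("Person", 98.9), ("Human", 98.9),
    ("Person", 98.7), ("Person", 97.9), ("Dress", 96.4), ("Person", 95.2),
    ("Female", 89.5), ("Sleeve", 87.6), ("Long Sleeve", 79.5), ("Woman", 70.7),
]

def confident_labels(tags, threshold=90.0):
    """Keep only labels whose confidence meets the threshold."""
    return [label for label, score in tags if score >= threshold]

print(confident_labels(amazon_tags))
# → ['Clothing', 'Apparel', 'Person', 'Human', 'Person', 'Person', 'Dress', 'Person']
```

Note that "Person" appears four times at high confidence, matching the four sitters in the photograph; duplicate labels correspond to separate detected instances, not a counting error.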

Imagga
created on 2021-12-14

people 32.3
brass 31.1
male 29.1
person 27.5
man 24.2
wind instrument 23.9
human 23.2
kin 22.7
adult 20.9
men 19.8
group 19.3
golfer 19.3
cornet 19
business 17
player 16.7
musical instrument 16.3
businessman 15.9
silhouette 15.7
happy 15
contestant 13.7
exercise 13.6
family 13.3
sport 13.3
black 12.6
together 12.3
body 12
bugle 11.8
active 11.7
portrait 11.6
standing 11.3
women 11.1
love 11
team 10.8
handsome 10.7
success 10.5
couple 10.5
happiness 10.2
lifestyle 10.1
suit 10
fashion 9.8
health 9.7
anatomy 9.7
friendship 9.4
teamwork 9.3
art 9.3
smile 9.3
one 9
activity 9
professional 8.9
karate 8.9
sibling 8.8
boy 8.8
graphic 8.8
hands 8.7
smiling 8.7
crowd 8.6
youth 8.5
old 8.4
joy 8.4
girls 8.2
healthy 8.2
pose 8.2
fitness 8.1
symbol 8.1
light 8
life 8
clothing 8
child 7.9
work 7.8
fight 7.7
corporate 7.7
summer 7.7
nurse 7.4
mature 7.4
guy 7.4
figure 7.3
design 7.3
dress 7.2
home 7.2
mother 7.1
science 7.1
medical 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

posing 99.1
text 98.4
clothing 94.5
smile 91.2
person 89.3
dress 88.8
standing 88
woman 61.6
old 44.6
male 15.1

Face analysis


AWS Rekognition

Age 38-56
Gender Female, 50.1%
Calm 72.5%
Surprised 9.5%
Angry 5.3%
Confused 4.5%
Sad 3.5%
Happy 2.3%
Fear 1.6%
Disgusted 0.8%

AWS Rekognition

Age 22-34
Gender Female, 69.3%
Happy 40%
Surprised 37.2%
Calm 14.8%
Angry 3.1%
Confused 1.6%
Sad 1.5%
Fear 1.3%
Disgusted 0.5%

AWS Rekognition

Age 47-65
Gender Male, 97.6%
Happy 54.8%
Calm 25.7%
Surprised 12.9%
Confused 3.8%
Sad 1.1%
Angry 0.8%
Disgusted 0.7%
Fear 0.2%

AWS Rekognition

Age 22-34
Gender Female, 97.1%
Happy 52.6%
Calm 27.5%
Surprised 10.9%
Sad 4.4%
Fear 2.5%
Confused 1.1%
Angry 0.8%
Disgusted 0.3%
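Each AWS Rekognition face record above reports a full emotion breakdown; the face's apparent dominant emotion is simply the entry with the highest percentage. A minimal sketch, assuming the four dictionaries below faithfully mirror the four face records above:

```python
# Illustrative sketch: pick the dominant emotion from a Rekognition-style
# emotion breakdown. The dictionaries mirror the four face records above.
faces = [
    {"Calm": 72.5, "Surprised": 9.5, "Angry": 5.3, "Confused": 4.5,
     "Sad": 3.5, "Happy": 2.3, "Fear": 1.6, "Disgusted": 0.8},
    {"Happy": 40.0, "Surprised": 37.2, "Calm": 14.8, "Angry": 3.1,
     "Confused": 1.6, "Sad": 1.5, "Fear": 1.3, "Disgusted": 0.5},
    {"Happy": 54.8, "Calm": 25.7, "Surprised": 12.9, "Confused": 3.8,
     "Sad": 1.1, "Angry": 0.8, "Disgusted": 0.7, "Fear": 0.2},
    {"Happy": 52.6, "Calm": 27.5, "Surprised": 10.9, "Sad": 4.4,
     "Fear": 2.5, "Confused": 1.1, "Angry": 0.8, "Disgusted": 0.3},
]

def dominant_emotion(emotions):
    """Return the emotion label with the highest reported percentage."""
    return max(emotions, key=emotions.get)

print([dominant_emotion(f) for f in faces])
# → ['Calm', 'Happy', 'Happy', 'Happy']
```

Note the spread: for the second face, "Happy" (40%) barely edges out "Surprised" (37.2%), so the dominant label alone hides how uncertain the classification is.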

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.9%

Captions

Microsoft

a group of people posing for a photo 95.1%
a group of people posing for the camera 95%
a group of people posing for a picture 94.9%

Text analysis

Amazon

YT37A2
MJIA YT37A2
MJIA

Google

MJIH YT3HA2 032MA
MJIH
032MA
YT3HA2
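The two OCR services read the markings differently (Amazon's YT37A2 and MJIA versus Google's YT3HA2, MJIH, and 032MA), which is typical for faint film-edge codes. A quick way to quantify agreement is set intersection over the detected tokens; this sketch uses plain Python with the token sets taken from the lists above.

```python
# Illustrative sketch: compare the OCR tokens the two services read from
# the photograph's markings. Token sets are taken from the lists above.
amazon_tokens = {"YT37A2", "MJIA"}
google_tokens = {"MJIH", "YT3HA2", "032MA"}

agreed = amazon_tokens & google_tokens   # tokens both services read
print(sorted(agreed))
# → []  (the two readings share no exact token)
```

An exact-match comparison is strict: YT37A2 and YT3HA2 differ by a single character (7 versus H), so an edit-distance measure would show the services nearly agree on that token.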