Human Generated Data

Title

Untitled (portrait of three military men and family)

Date

c. 1940

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2097

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Human 99.6
Person 99.6
Person 98.7
Apparel 98.5
Clothing 98.5
Person 98.3
Person 97.1
Person 96.7
Furniture 96.3
Tie 95.3
Accessories 95.3
Accessory 95.3
Chair 91.6
People 86
Stage 60
Sailor Suit 59.7
Nurse 59.5
Curtain 57.7
Food 56.8
Cake 56.8
Icing 56.8
Creme 56.8
Dessert 56.8
Cream 56.8
Photo 55.9
Photography 55.9
Face 55.9
Portrait 55.9
Flooring 55.8
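
The label-and-confidence pairs above are the kind of output Amazon Rekognition's label detection returns. A minimal sketch of such a call using boto3, assuming AWS credentials are configured in the environment and a local scan of the photograph exists (the file name below is hypothetical):

    import boto3

    # Assumption: region and credentials come from the standard AWS config/env.
    rekognition = boto3.client("rekognition")

    # Hypothetical local copy of the scanned photograph.
    with open("4.2002.2097.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=50,
        MinConfidence=55,  # roughly the lowest confidence shown in the list above
    )

    # Print each label with its confidence, matching the "Name score" layout above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")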

Imagga
created on 2021-12-14

person 39
people 36.2
golfer 35.9
male 31.9
man 31.6
player 30.4
men 28.3
group 27.4
contestant 25.3
adult 24.3
businessman 23.8
business 21.9
human 19.5
professional 19.2
silhouette 19
portrait 16.8
nurse 16.5
suit 16.4
team 16.1
happy 15.7
women 15
corporate 14.6
kin 13.5
teamwork 13
crowd 12.5
couple 12.2
together 11.4
standing 11.3
hands 11.3
worker 10.9
lifestyle 10.8
smile 10.7
job 10.6
life 10.5
success 10.5
sport 10.4
black 10.2
happiness 10.2
light 10
modern 9.8
handsome 9.8
family 9.8
one 9.7
boy 9.6
love 9.5
girls 9.1
attractive 9.1
businesswoman 9.1
fashion 9
athlete 8.9
world 8.9
body 8.8
work 8.6
party 8.6
mother 8.6
walking 8.5
art 8.4
guy 8.3
active 8.2
exercise 8.2
fitness 8.1
dress 8.1
dancer 8.1
office 8
looking 8
bright 7.9
child 7.7
company 7.4
holding 7.4
clothing 7.3
design 7.3
smiling 7.2
fresh 7.2
dance 7.1
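
The Imagga tags come from a REST tagging endpoint rather than an AWS-style SDK. A sketch using the requests library, assuming an Imagga API key/secret pair (placeholders below) and the same hypothetical local file; the endpoint and response shape follow Imagga's public v2 API documentation:

    import requests

    IMAGGA_KEY = "YOUR_API_KEY"        # placeholder
    IMAGGA_SECRET = "YOUR_API_SECRET"  # placeholder

    # POST the image to the v2 tagging endpoint with HTTP basic auth.
    with open("4.2002.2097.jpg", "rb") as f:
        resp = requests.post(
            "https://api.imagga.com/v2/tags",
            auth=(IMAGGA_KEY, IMAGGA_SECRET),
            files={"image": f},
        )
    resp.raise_for_status()

    # Each result carries an English tag name and a 0-100 confidence score.
    for item in resp.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")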

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

posing 91.1
person 88
text 84.1
clothing 83.2
smile 74.2
man 67.4
sport 66.7
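
The Microsoft tags are typical of the Azure Computer Vision image-tagging endpoint. A sketch with the azure-cognitiveservices-vision-computervision package, assuming a Cognitive Services endpoint and key (both placeholders) and the same hypothetical local file:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
    KEY = "YOUR_KEY"                                                   # placeholder

    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

    # Tag a local copy of the scanned photograph (hypothetical file name).
    with open("4.2002.2097.jpg", "rb") as f:
        result = client.tag_image_in_stream(f)

    # Confidences are returned as 0-1 floats; scale to percentages as listed above.
    for tag in result.tags:
        print(f"{tag.name} {tag.confidence * 100:.1f}")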

Face analysis

Amazon

Google

AWS Rekognition

Age 48-66
Gender Male, 94%
Sad 48.9%
Calm 36.7%
Angry 5%
Surprised 3.4%
Confused 1.7%
Fear 1.6%
Happy 1.5%
Disgusted 1%

AWS Rekognition

Age 22-34
Gender Male, 82.7%
Angry 38.4%
Calm 25.8%
Surprised 23.3%
Fear 3.8%
Happy 3.4%
Confused 2.6%
Sad 1.5%
Disgusted 1.2%

AWS Rekognition

Age 20-32
Gender Male, 90.8%
Calm 98.1%
Happy 0.9%
Sad 0.3%
Confused 0.2%
Angry 0.1%
Surprised 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 22-34
Gender Female, 55.6%
Angry 62.3%
Calm 22.8%
Sad 5.6%
Happy 3%
Surprised 2.9%
Confused 2.1%
Fear 0.9%
Disgusted 0.5%

AWS Rekognition

Age 32-48
Gender Female, 61.8%
Calm 63.6%
Happy 14.1%
Surprised 13.1%
Confused 3%
Sad 2.6%
Fear 2%
Angry 1.1%
Disgusted 0.5%

AWS Rekognition

Age 24-38
Gender Male, 59.3%
Surprised 32%
Calm 30.9%
Happy 22.7%
Angry 8.9%
Fear 2.2%
Confused 1.7%
Sad 0.8%
Disgusted 0.7%
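
The per-face age ranges, gender estimates, and emotion scores above correspond to Amazon Rekognition face analysis with the full attribute set requested. A sketch, again with boto3 and the hypothetical local file:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("4.2002.2097.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # age range, gender, and emotions require ALL
        )

    # One FaceDetail per detected face, mirroring the blocks listed above.
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")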

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
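
The Google Vision rows report likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A sketch with the google-cloud-vision client library, assuming application-default credentials and the same hypothetical file:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    # Hypothetical local copy of the scanned photograph.
    with open("4.2002.2097.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # One annotation per face; each likelihood field is an enum value.
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)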

Feature analysis

Amazon

Person 99.6%
Tie 95.3%
Chair 91.6%

Captions

Microsoft

a group of people posing for a photo 95.5%
a group of people posing for the camera 95.4%
a group of people posing for a picture 95.3%
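
The three caption variants are the kind of output the Azure Computer Vision describe endpoint produces when several candidates are requested. A sketch using the same package and placeholder credentials as the tagging example above; the max_candidates parameter name follows the current Python SDK:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
    KEY = "YOUR_KEY"                                                   # placeholder

    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

    # Ask for up to three caption candidates (hypothetical file name).
    with open("4.2002.2097.jpg", "rb") as f:
        description = client.describe_image_in_stream(f, max_candidates=3)

    for caption in description.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")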

Text analysis

Amazon

SALE
٤١٢
РИЗСО SALE ٤١٢
РИЗСО
U.S.BAA

Google

MJIR YT3RA2 032MA
YT3RA2
MJIR
032MA
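
Both blocks are raw OCR detections from the photograph and are reproduced as returned. A sketch of the two text-detection calls, reusing the hypothetical file and credential assumptions from the earlier examples:

    import boto3
    from google.cloud import vision

    with open("4.2002.2097.jpg", "rb") as f:
        image_bytes = f.read()

    # Amazon Rekognition text detection: LINE entries give whole detected strings.
    rekognition = boto3.client("rekognition")
    aws_response = rekognition.detect_text(Image={"Bytes": image_bytes})
    for detection in aws_response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])

    # Google Cloud Vision text detection: the first annotation is the full block,
    # the rest are the individual words listed above.
    client = vision.ImageAnnotatorClient()
    gcv_response = client.text_detection(image=vision.Image(content=image_bytes))
    for annotation in gcv_response.text_annotations[1:]:
        print(annotation.description)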