Human Generated Data

Title

Untitled (two men and two women posed in studio)

Date

c. 1945

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1821

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Clothing 99.4
Apparel 99.4
Person 99.1
Human 99.1
Person 99
Person 96.5
Person 94.1
Suit 80.2
Coat 80.2
Overcoat 80.2
Sleeve 77.4
Person 75.5
Chair 71.6
Furniture 71.6
Female 71.6
Face 69.3
Portrait 63.8
Photography 63.8
Photo 63.8
Long Sleeve 62.4
People 61.9
Flower 59.6
Plant 59.6
Blossom 59.6
Shoe 59.5
Footwear 59.5
Crystal 58
Person 57.7
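
The Amazon tags above are label/confidence pairs of the kind returned by the AWS Rekognition DetectLabels API, with scores given as percentages. A minimal boto3 sketch is below; the local file name and the confidence threshold are assumptions, not part of this record.

```python
# Sketch: label detection with AWS Rekognition via boto3.
# The file name and the 55% threshold are assumptions, not from the record.
import boto3

def rekognition_labels(image_path: str, min_confidence: float = 55.0):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=min_confidence,
        )
    # Each label carries a name and a confidence in percent,
    # matching the tag/score pairs listed above.
    return [(lbl["Name"], round(lbl["Confidence"], 1)) for lbl in response["Labels"]]

if __name__ == "__main__":
    for name, score in rekognition_labels("photo.jpg"):
        print(name, score)
```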

Clarifai
created on 2023-10-15

people 99.6
man 98.1
group 97.9
adult 97.3
actor 93.9
woman 93.8
uniform 91.9
portrait 91.9
veil 90.4
group together 89.9
wedding 88.2
wear 82.7
partnership 81.5
squad 80.6
coat 80.2
doctor 79.7
bride 75
healthcare 74.1
medicine 73.8
musician 72.7
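
The Clarifai concepts above look like output from Clarifai's general image-recognition model, which scores concepts on a 0-1 scale (shown here as percentages). A hedged sketch against the v2 REST predict call is below; the personal access token, model id, and image URL are placeholders, and the exact endpoint and payload shape should be checked against Clarifai's current documentation.

```python
# Sketch: concept tagging via the Clarifai v2 REST API (all credentials are placeholders).
import requests

CLARIFAI_PAT = "YOUR_PERSONAL_ACCESS_TOKEN"  # placeholder
MODEL_ID = "general-image-recognition"       # assumed general model id

def clarifai_concepts(image_url: str):
    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {CLARIFAI_PAT}"},
        json={"inputs": [{"data": {"image": {"url": image_url}}}]},
        timeout=30,
    )
    response.raise_for_status()
    concepts = response.json()["outputs"][0]["data"]["concepts"]
    # Clarifai scores are 0-1; scale to percent to compare with the list above.
    return [(c["name"], round(c["value"] * 100, 1)) for c in concepts]
```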

Imagga
created on 2021-12-14

male 31.2
people 30.7
man 27.9
silhouette 26.5
person 26.3
business 22.5
human 20.3
suit 20.2
men 19.8
businessman 18.6
professional 15.2
art 14.7
adult 13
fashion 12.8
one 12.7
handsome 12.5
pose 11.8
work 11.8
portrait 11.7
team 11.7
black 11.6
3d 11.6
group 11.3
body 11.2
casual 11
outfit 10.6
graphic 10.2
job 9.7
shape 9.6
standing 9.6
golfer 9.5
stand 9.5
happy 9.4
smile 9.3
clothing 9
style 8.9
family 8.9
success 8.9
women 8.7
render 8.7
crowd 8.6
corporate 8.6
player 8.6
jacket 8.5
studio 8.4
figure 8.3
sport 8.2
shadow 8.1
activity 8.1
looking 8
hair 7.9
design 7.9
hands 7.8
boy 7.7
attractive 7.7
health 7.6
communication 7.6
company 7.4
teamwork 7.4
action 7.4
contestant 7.4
leg 7.4
lady 7.3
exercise 7.3
idea 7.1
posing 7.1
face 7.1
love 7.1
model 7
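
The Imagga tags above match the shape of Imagga's v2 auto-tagging endpoint, which returns a confidence per tag. A rough sketch via plain HTTP is below; the API key, secret, and image URL are placeholders.

```python
# Sketch: auto-tagging with the Imagga v2 REST API (credentials are placeholders).
import requests

IMAGGA_KEY = "YOUR_API_KEY"        # placeholder
IMAGGA_SECRET = "YOUR_API_SECRET"  # placeholder

def imagga_tags(image_url: str):
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": image_url},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
        timeout=30,
    )
    response.raise_for_status()
    tags = response.json()["result"]["tags"]
    # Each entry carries a confidence score comparable to the values above.
    return [(t["tag"]["en"], round(t["confidence"], 1)) for t in tags]
```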

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

posing 96.2
text 95.1
clothing 95
person 89
standing 87.3
smile 86.4
dress 82
wedding dress 53.2
woman 52.9
clothes 22.1
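
The Microsoft tags above resemble output from the Azure Computer Vision tagging operation, which reports confidences on a 0-1 scale (rendered here as percentages). A hedged sketch against the v3.2 REST API follows; the resource endpoint, key, and image URL are placeholders.

```python
# Sketch: image tagging with the Azure Computer Vision v3.2 REST API (placeholders).
import requests

AZURE_ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
AZURE_KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder

def azure_tags(image_url: str):
    response = requests.post(
        f"{AZURE_ENDPOINT}/vision/v3.2/tag",
        headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
        json={"url": image_url},
        timeout=30,
    )
    response.raise_for_status()
    # Azure confidences are 0-1; scale to percent for comparison with the list above.
    return [(t["name"], round(t["confidence"] * 100, 1))
            for t in response.json()["tags"]]
```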

Color Analysis

Face analysis

AWS Rekognition

Age 51-69
Gender Male, 89.1%
Calm 76.9%
Surprised 14.3%
Confused 4.6%
Sad 2%
Fear 0.8%
Happy 0.7%
Angry 0.4%
Disgusted 0.2%

AWS Rekognition

Age 22-34
Gender Female, 67.6%
Surprised 47.4%
Calm 22.5%
Happy 21.8%
Fear 3.1%
Sad 1.7%
Angry 1.6%
Confused 1.4%
Disgusted 0.5%

AWS Rekognition

Age 21-33
Gender Male, 83.2%
Calm 56.1%
Happy 22.2%
Sad 7.1%
Surprised 6.4%
Confused 3.3%
Angry 2.8%
Fear 1.3%
Disgusted 0.8%

AWS Rekognition

Age 21-33
Gender Female, 87.9%
Surprised 86.8%
Happy 6.5%
Calm 3.2%
Fear 1.2%
Angry 1.2%
Sad 0.5%
Confused 0.4%
Disgusted 0.1%
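
The four AWS Rekognition blocks above (estimated age range, gender, and emotion scores per detected face) correspond to the DetectFaces API with all attributes requested. A minimal boto3 sketch, with the file name assumed:

```python
# Sketch: face analysis with AWS Rekognition DetectFaces (all attributes).
import boto3

def rekognition_faces(image_path: str):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],
        )
    results = []
    for face in response["FaceDetails"]:
        results.append({
            # Age is an estimated range, e.g. 21-33, as shown above.
            "age": (face["AgeRange"]["Low"], face["AgeRange"]["High"]),
            "gender": (face["Gender"]["Value"], round(face["Gender"]["Confidence"], 1)),
            # Emotions are returned as type/confidence pairs, sorted here by score.
            "emotions": sorted(
                ((e["Type"], round(e["Confidence"], 1)) for e in face["Emotions"]),
                key=lambda pair: pair[1], reverse=True,
            ),
        })
    return results
```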

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
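
The Google Vision blocks above report per-face likelihood labels (e.g. "Very unlikely") rather than percentages. A sketch using the google-cloud-vision client library, with the file name assumed:

```python
# Sketch: per-face likelihoods with the Google Cloud Vision client library.
from google.cloud import vision

def vision_face_likelihoods(image_path: str):
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)
    results = []
    for face in response.face_annotations:
        # Likelihoods are enum values such as VERY_UNLIKELY ("Very unlikely" above).
        results.append({
            "surprise": vision.Likelihood(face.surprise_likelihood).name,
            "anger": vision.Likelihood(face.anger_likelihood).name,
            "sorrow": vision.Likelihood(face.sorrow_likelihood).name,
            "joy": vision.Likelihood(face.joy_likelihood).name,
            "headwear": vision.Likelihood(face.headwear_likelihood).name,
            "blurred": vision.Likelihood(face.blurred_likelihood).name,
        })
    return results
```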

Feature analysis

Amazon

Person 99.1%
Shoe 59.5%

Categories

Imagga

paintings art 96.5%
people portraits 3.3%

Text analysis

Amazon

MJIR
MJIR YT33A2
YT33A2
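
The strings above are OCR results; AWS Rekognition's DetectText returns detected lines and words with confidences. A minimal boto3 sketch, with the file name assumed:

```python
# Sketch: text detection with AWS Rekognition DetectText.
import boto3

def rekognition_text(image_path: str):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})
    # LINE detections correspond to strings like "MJIR YT33A2" above;
    # WORD detections break them into individual tokens.
    return [(d["DetectedText"], d["Type"], round(d["Confidence"], 1))
            for d in response["TextDetections"]]
```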

Google

MJIA YT3RA2 032MA
MJIA
YT3RA2
032MA
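
The Google strings likewise come from OCR. With the google-cloud-vision client, text_detection returns the full transcript first, followed by the individual words; a short sketch with the file name assumed:

```python
# Sketch: OCR with Google Cloud Vision text detection.
from google.cloud import vision

def vision_text(image_path: str):
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.text_detection(image=image)
    annotations = response.text_annotations
    # The first annotation is the full detected text (e.g. "MJIA YT3RA2 032MA");
    # the remaining annotations are the individual words.
    full_text = annotations[0].description if annotations else ""
    words = [a.description for a in annotations[1:]]
    return full_text, words
```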