Human Generated Data

Title

Untitled (man and woman posed for portrait in studio)

Date

c. 1945

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1830

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Furniture 100
Clothing 99.9
Apparel 99.9
Dress 98.9
Person 98.8
Human 98.8
Suit 98.7
Overcoat 98.7
Coat 98.7
Person 98.5
Female 95.7
Chair 95.7
Face 91.9
Stage 86.5
Woman 85.9
Tuxedo 85.8
Sailor Suit 84.6
Shirt 79.3
Flooring 76.3
Sleeve 76.2
Bridegroom 75.1
Wedding 75.1
Man 74.3
Indoors 74.1
Portrait 73.1
Photography 73.1
Photo 73.1
Robe 68.2
Fashion 68.2
Costume 67.2
Girl 66.9
Long Sleeve 63.9
Gown 63
Curtain 60.3
People 59.3
Wedding Gown 58.8
Smile 58.6
Footwear 58.6
Floor 58.3
Table 57.1
Shoe 50.9
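
The label/confidence pairs above are the kind of output Amazon Rekognition's DetectLabels operation returns. A minimal sketch using boto3 follows; the image file, region, and thresholds are placeholders, not values taken from this record.

```python
import boto3

# Placeholder client and image source; this record does not include either.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns label names with confidence scores (0-100),
# matching the "name score" pairs listed above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=50,
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```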

Clarifai
created on 2023-10-15

people 99.7
two 98.4
adult 98.2
man 97.6
wedding 96.8
woman 96.3
veil 92.4
bride 92.2
chair 91.4
groom 91.2
wear 91
uniform 88.3
portrait 88
love 86.5
facial expression 86.3
monochrome 86.2
hotel 85.6
actor 85
outfit 82.8
couple 81.3
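
Concept scores like those above (0-1 values scaled to percentages) can be requested from Clarifai's general image-recognition model. The sketch below assumes the v2 REST interface; the endpoint path, model name, and token are illustrative assumptions, not details from this record.

```python
import base64
import requests

API_KEY = "YOUR_CLARIFAI_KEY"                       # placeholder credential
MODEL = "general-image-recognition"                 # assumed public model name
URL = f"https://api.clarifai.com/v2/models/{MODEL}/outputs"

with open("photo.jpg", "rb") as f:                  # placeholder image file
    image_b64 = base64.b64encode(f.read()).decode()

payload = {"inputs": [{"data": {"image": {"base64": image_b64}}}]}
headers = {"Authorization": f"Key {API_KEY}"}

resp = requests.post(URL, json=payload, headers=headers, timeout=30)

# Each concept carries a 0-1 confidence; scale to match the list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```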

Imagga
created on 2021-12-14

groom 54.9
suit 28.6
man 27.5
person 26.8
adult 25.8
clothing 24.2
garment 24
people 23.4
male 22
bride 18.8
love 16.6
couple 16.5
portrait 16.2
black 15.7
happiness 14.9
dress 14.4
bow tie 14
wedding 13.8
happy 13.8
fashion 13.6
lady 13
beach 12.6
sunset 12.6
silhouette 12.4
businessman 12.4
necktie 12.1
outdoors 11.9
day 11.8
sexy 11.2
pretty 11.2
attractive 11.2
business 10.9
model 10.9
family 10.7
human 10.5
looking 10.4
men 10.3
summer 10.3
women 10.3
sky 10.2
dark 10
covering 10
hand 9.9
water 9.3
two 9.3
smile 9.3
outdoor 9.2
modern 9.1
pose 9.1
one 9
waiter 8.9
romantic 8.9
style 8.9
consumer goods 8.8
light 8.7
sea 8.6
marriage 8.5
sport 8.4
elegance 8.4
life 8.4
holding 8.2
employee 8.1
dancer 8.1
husband 8
smiling 8
lifestyle 7.9
professional 7.9
brunette 7.8
standing 7.8
corporate 7.7
married 7.7
casual 7.6
walk 7.6
dance 7.6
wife 7.6
passion 7.5
fun 7.5
exercise 7.3
active 7.2
dining-room attendant 7.2
romance 7.1
posing 7.1
face 7.1
performer 7.1
look 7
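
Imagga's tagging endpoint returns tag/confidence pairs like the list above (the "Categories" scores further below come from a separate categorizer endpoint). A hedged sketch of the v2 REST call, with placeholder credentials and image file:

```python
import requests

AUTH = ("IMAGGA_API_KEY", "IMAGGA_API_SECRET")      # placeholder credentials

# Upload the image to the /v2/tags endpoint (HTTP basic auth).
with open("photo.jpg", "rb") as f:                  # placeholder image file
    resp = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=AUTH,
        files={"image": f},
        timeout=30,
    )

# Each entry pairs a localized tag name with a 0-100 confidence score.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```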

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

clothing 89.3
wedding 81.7
black and white 78.8
person 76.6
text 75.7
dress 71.7
man 71.7
wedding dress 70.3
white 66.3
smile 57.7
bride 55.8
footwear 50.3
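
Both the Microsoft tag scores above and the captions listed under "Captions" below can be produced by Azure Computer Vision's analyze endpoint with the Tags and Description features. The sketch below is illustrative only; the resource endpoint, API version, and key are placeholders.

```python
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"   # placeholder
KEY = "YOUR_AZURE_KEY"                                           # placeholder

with open("photo.jpg", "rb") as f:                               # placeholder image file
    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags,Description"},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
        timeout=30,
    )

analysis = resp.json()
# Tags and captions both carry 0-1 confidences; scale to percentages.
for tag in analysis["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
for caption in analysis["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}")
```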

Color Analysis

Face analysis

AWS Rekognition

Age 50-68
Gender Male, 98.2%
Calm 71.5%
Happy 19.4%
Surprised 3.5%
Sad 1.9%
Confused 1.4%
Fear 0.8%
Angry 0.8%
Disgusted 0.7%

AWS Rekognition

Age 22-34
Gender Female, 98.2%
Surprised 78.5%
Calm 16.4%
Happy 3.3%
Confused 0.5%
Angry 0.5%
Fear 0.4%
Sad 0.2%
Disgusted 0.2%
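
The two face readings above (age range, gender, and an emotion distribution) match the shape of Rekognition's DetectFaces output when all attributes are requested. A minimal boto3 sketch with a placeholder image source:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:                  # placeholder image file
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],                         # include age, gender, emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```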

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
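
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the two readings above use wording like "Very unlikely". A sketch assuming a recent google-cloud-vision client and a placeholder image file:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()              # uses default credentials

with open("photo.jpg", "rb") as f:                  # placeholder image file
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is an enum value; .name yields e.g. "VERY_UNLIKELY".
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```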

Feature analysis

Amazon

Person 98.8%
Chair 95.7%
Shoe 50.9%

Categories

Imagga

people portraits 88.6%
paintings art 10.2%

Captions

Microsoft
created on 2021-12-14

a man wearing a costume 64.4%
a man standing next to a dog 27.5%

Text analysis

Amazon

TPT

Google

NADON-YTHA2- HAMT2A TPT 1113
NADON-YTHA2-
HAMT2A
TPT
1113
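
OCR fragments like the strings above are typical of text-detection output on photographs with faint or stylized lettering. A sketch of both services' text calls, with a placeholder image input:

```python
import boto3
from google.cloud import vision

with open("photo.jpg", "rb") as f:                  # placeholder image file
    image_bytes = f.read()

# Amazon Rekognition DetectText: print line-level detections only.
rekognition = boto3.client("rekognition", region_name="us-east-1")
for detection in rekognition.detect_text(Image={"Bytes": image_bytes})["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])

# Google Cloud Vision text detection: the first annotation is the full text,
# followed by the individual blocks/words (as in the list above).
client = vision.ImageAnnotatorClient()
response = client.text_detection(image=vision.Image(content=image_bytes))
for annotation in response.text_annotations:
    print(annotation.description)
```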