Human Generated Data

Title

Untitled (boy and girl)

Date

c. 1920

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.781

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.2
Human 99.2
Person 98.2
Clothing 98.2
Apparel 98.2
Overcoat 88.3
Officer 77.2
Military Uniform 77.2
Military 77.2
Suit 74.4
People 67
Photography 60.2
Photo 60.2
Door 58.8
Portrait 58.8
Face 58.8
Coat 56.1
Sleeve 55.4

Clarifai
created on 2023-10-15

people 99.7
portrait 98.4
wear 98.1
leader 97.4
adult 96.8
two 96.4
woman 95.3
man 94.1
art 93.2
royalty 91.6
retro 91.2
veil 88.8
affection 88.4
military 87
painting 86.9
outfit 86.1
monarch 85
administration 84.5
group 83.9
elderly 82.8

Imagga
created on 2021-12-14

clothing 22.4
people 22.3
groom 21.3
person 21.3
fashion 20.3
dress 19.9
adult 19.4
man 18.5
bow tie 18.4
garment 17.4
male 15.6
attractive 15.4
statue 15.2
portrait 14.9
black 14.8
couple 14.8
necktie 14.6
lady 14.6
sexy 14.4
model 14
old 13.9
coat 12.7
style 12.6
vintage 12.4
love 11.8
posing 11.5
face 11.4
suit 11.3
body 11.2
women 11.1
girls 10.9
sculpture 10.9
art 10.5
hair 10.3
culture 10.2
smile 10
pretty 9.8
happiness 9.4
happy 9.4
kin 9.3
elegance 9.2
sensuality 9.1
human 9
sepia 8.7
brunette 8.7
lifestyle 8.7
ancient 8.6
men 8.6
bride 8.6
two 8.5
trench coat 8.3
world 8.3
holding 8.2
antique 7.9
youth 7.7
fashionable 7.6
outfit 7.5
city 7.5
business 7.3
romantic 7.1
to 7.1
architecture 7

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 99.1
clothing 98.1
person 96.5
posing 95.9
woman 94.5
standing 93.5
old 89.7
dress 89.4
coat 80.5
smile 80.2
vintage clothing 72.6
suit 63.8
footwear 57.9
retro style 55.7
human face 51.3
dressed 37.8
clothes 26.4

Color Analysis

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 15-27
Gender Female, 52.6%
Calm 97.5%
Sad 1.1%
Confused 0.5%
Angry 0.4%
Surprised 0.3%
Happy 0.1%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 5-15
Gender Female, 83.7%
Calm 98.3%
Happy 0.9%
Surprised 0.4%
Sad 0.2%
Confused 0.1%
Angry 0.1%
Disgusted 0%
Fear 0%

Microsoft Cognitive Services

Age 30
Gender Male

Microsoft Cognitive Services

Age 23
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%
Coat 56.1%

Categories

Imagga

paintings art 99.3%

Text analysis

Amazon

MANCHESTER,N.H.
DURETTE,STUDIO

Google

DURETTE
DURETTE TUDIO MANCHESTER,N.H.
TUDIO
MANCHESTER,N.H.