Human Generated Data

Title

Untitled (photographers with models on street)

Date

c. 1950

People

Artist: Mary Lowber Tiers, American, 1916-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15736

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 99.7
Human 99.7
Person 99.7
Person 98.3
Person 96.3
Person 94.4
Person 93
Clothing 92.7
Apparel 92.7
Person 92.3
Person 90.6
Floor 87.2
Flooring 86.1
Person 83.2
Art 73.3
Dance Pose 71.8
Leisure Activities 71.8
Female 67.1
People 66.1
Person 61.3
Face 61.1
Girl 59.6
Dress 58.7
Costume 58.4
Drawing 58.1
Stage 57.8
Text 55.4
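
The label/confidence pairs above have the shape of an AWS Rekognition DetectLabels response. Below is a minimal sketch of how such tags could be regenerated with boto3; the file name, MaxLabels, and MinConfidence values are assumptions, not part of the original record.

```python
# Minimal sketch (assumptions: local scan file name, label cap, confidence floor).
import boto3

rekognition = boto3.client("rekognition")

with open("tiers_untitled_photographers_with_models.jpg", "rb") as f:  # hypothetical file
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,        # assumed cap; the record lists roughly 27 labels
    MinConfidence=55.0,  # assumed floor; the lowest score shown is 55.4
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```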

Clarifai
created on 2023-10-28

people 99.7
group 93.9
adult 93.6
man 92.8
illustration 92.1
street 92
wear 91.3
music 91.2
art 91
woman 88.9
vintage 87.9
room 84.5
dress 83.9
group together 82.3
old 82.1
monochrome 81.6
outfit 81.3
many 80.3
veil 78
musician 77.9
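
The Clarifai concepts above (name plus score) could plausibly come from Clarifai's v2 predict endpoint. The sketch below uses the requests library; the endpoint path, model id ("general-image-recognition"), key placeholder, and file name are all assumptions and should be checked against Clarifai's current documentation.

```python
# Minimal sketch (assumptions: endpoint path, model id, credentials, file name).
import base64
import requests

API_KEY = "YOUR_CLARIFAI_PAT"           # placeholder access token
MODEL_ID = "general-image-recognition"  # assumed general concepts model

with open("tiers_untitled.jpg", "rb") as f:  # hypothetical local scan
    image_b64 = base64.b64encode(f.read()).decode("ascii")

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
)
resp.raise_for_status()

# Concepts carry a 0-1 value; scale to match the 0-100 scores shown above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```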

Imagga
created on 2022-02-05

blackboard 41.6
decoration 29
graffito 24.8
old 22.3
vintage 20.7
retro 19.6
grunge 19.6
antique 19
art 18.9
frame 17.5
texture 17.4
interior 15.9
empty 15.5
aged 14.5
wall 14.3
design 13.5
fire screen 13.4
screen 12.9
room 12.6
material 12.5
window 12.4
textured 12.3
ancient 12.1
home 12
house 11.7
wood 11.7
outdoors 11.2
architecture 11
city 10.8
inside 10.1
structure 10
decorative 10
building 9.7
detail 9.6
shop 9.5
protective covering 9.5
old fashioned 9.5
door 9.3
border 9
religion 9
brown 8.8
urban 8.7
paper 8.7
luxury 8.6
rusty 8.6
wallpaper 8.4
black 8.4
element 8.3
historic 8.2
sculpture 8.2
dirty 8.1
decor 8
wooden 7.9
film 7.7
decay 7.7
blank 7.7
stained 7.7
damaged 7.6
covering 7.6
man 7.4
light 7.3
indoor 7.3
people 7.2
modern 7
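
The Imagga tags above match the shape of its /v2/tags endpoint output. A minimal sketch, assuming a publicly reachable image URL and placeholder credentials:

```python
# Minimal sketch (assumptions: image URL and API credentials are placeholders).
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"
API_SECRET = "YOUR_IMAGGA_API_SECRET"
image_url = "https://example.org/tiers_untitled.jpg"  # hypothetical public URL

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=(API_KEY, API_SECRET),  # HTTP basic auth with key/secret
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```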

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 98.4
group 56.1
black and white 54
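
The Microsoft tags above ("text", "group", "black and white") have the shape of Azure Computer Vision's Analyze Image output with the Tags feature. A minimal sketch against the v3.2 REST endpoint; the resource endpoint, key, and image URL are placeholders.

```python
# Minimal sketch (assumptions: Azure resource endpoint, key, and image URL).
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_AZURE_KEY"
image_url = "https://example.org/tiers_untitled.jpg"  # hypothetical public URL

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": image_url},
)
resp.raise_for_status()

# Confidences come back in 0-1; scale to match the 0-100 scores shown above.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```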

Color Analysis

Face analysis

AWS Rekognition

Age 39-47
Gender Male, 99.9%
Calm 96.2%
Sad 1.5%
Happy 0.8%
Fear 0.5%
Surprised 0.4%
Angry 0.4%
Confused 0.2%
Disgusted 0.2%

AWS Rekognition

Age 35-43
Gender Female, 87.8%
Calm 76.4%
Happy 11.4%
Fear 3.9%
Angry 3.4%
Sad 2.3%
Surprised 1.2%
Confused 0.7%
Disgusted 0.6%

AWS Rekognition

Age 39-47
Gender Male, 95.8%
Happy 64.2%
Disgusted 21.2%
Angry 6.7%
Calm 6.2%
Sad 0.9%
Surprised 0.3%
Fear 0.3%
Confused 0.3%

AWS Rekognition

Age 36-44
Gender Female, 89.3%
Calm 98.4%
Sad 0.9%
Disgusted 0.2%
Happy 0.2%
Confused 0.1%
Angry 0.1%
Surprised 0.1%
Fear 0%

AWS Rekognition

Age 50-58
Gender Male, 92.8%
Calm 99.8%
Happy 0.1%
Surprised 0%
Confused 0%
Sad 0%
Angry 0%
Disgusted 0%
Fear 0%
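
Each AWS Rekognition block above (age range, gender, emotion percentages) matches the per-face structure returned by DetectFaces when all attributes are requested. A minimal sketch with boto3, assuming a local file name:

```python
# Minimal sketch (assumption: local scan file name).
import boto3

rekognition = boto3.client("rekognition")

with open("tiers_untitled.jpg", "rb") as f:  # hypothetical local scan
    image_bytes = f.read()

response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions arrive unsorted; sort to mirror the descending lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```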

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
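
The Google Vision blocks above report likelihood buckets rather than percentages; they correspond to the surprise, anger, sorrow, joy, headwear, and blurred likelihood fields of a face annotation. A minimal sketch with the google-cloud-vision client, assuming a local file name:

```python
# Minimal sketch (assumption: local scan file name).
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("tiers_untitled.jpg", "rb") as f:  # hypothetical local scan
    image = vision.Image(content=f.read())

def pretty(likelihood):
    # Map the Likelihood enum (0-5) to the record's wording, e.g. "Very unlikely".
    names = ("Unknown", "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely")
    return names[likelihood]

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", pretty(face.surprise_likelihood))
    print("Anger", pretty(face.anger_likelihood))
    print("Sorrow", pretty(face.sorrow_likelihood))
    print("Joy", pretty(face.joy_likelihood))
    print("Headwear", pretty(face.headwear_likelihood))
    print("Blurred", pretty(face.blurred_likelihood))
```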

Feature analysis

Amazon

Person
Person 99.7%
Person 99.7%
Person 98.3%
Person 96.3%
Person 94.4%
Person 93%
Person 92.3%
Person 90.6%
Person 83.2%
Person 61.3%
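
The repeated "Person" rows above likely correspond to individual bounding-box instances of the single "Person" label in a DetectLabels response, one entry per detected figure. A minimal sketch, again assuming a local file name:

```python
# Minimal sketch (assumption: local scan file name).
import boto3

rekognition = boto3.client("rekognition")

with open("tiers_untitled.jpg", "rb") as f:  # hypothetical local scan
    response = rekognition.detect_labels(Image={"Bytes": f.read()}, MinConfidence=60.0)

for label in response["Labels"]:
    if label["Name"] == "Person":
        # One instance per detected figure, each with its own confidence and box.
        for instance in label.get("Instances", []):
            print(f'Person {instance["Confidence"]:.1f}%')
```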

Categories

Text analysis

Amazon

alqoaq
-
VITTA
BAEI
IN
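
The fragments above ("alqoaq", "VITTA", "BAEI", "IN") resemble AWS Rekognition DetectText output on a low-contrast print, where partial or spurious strings are common. A minimal sketch with boto3, assuming a local file name:

```python
# Minimal sketch (assumption: local scan file name).
import boto3

rekognition = boto3.client("rekognition")

with open("tiers_untitled.jpg", "rb") as f:  # hypothetical local scan
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Keep only LINE detections to mirror the short list shown above.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```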

Google

etmi RKINAII sincag
etmi
RKINAII
sincag
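
Google's list shows the full detected string followed by its individual words, which is the shape of Cloud Vision text detection output (the first annotation is the whole block, the rest are word-level). A minimal sketch, assuming a local file name:

```python
# Minimal sketch (assumption: local scan file name).
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("tiers_untitled.jpg", "rb") as f:  # hypothetical local scan
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
annotations = response.text_annotations

if annotations:
    print(annotations[0].description)  # full text block first
    for word in annotations[1:]:       # then each detected word
        print(word.description)
```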