Human Generated Data

Title

Untitled (couple with carnival prizes)

Date

1952

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7746

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.4
Human 99.4
Clothing 99
Apparel 99
Person 98.9
Face 97.6
Female 97.3
Dog 95
Canine 95
Animal 95
Mammal 95
Pet 95
Person 94.9
Puppy 93.3
Woman 84.5
Smile 82.9
Girl 82.2
Costume 80.3
Countryside 78.1
Building 78.1
Outdoors 78.1
Shelter 78.1
Rural 78.1
Nature 78.1
Portrait 75.7
Photography 75.7
Photo 75.7
Kid 75.7
Child 75.7
Dress 74.2
Crowd 71.6
People 65.8
Man 64.2
Glasses 62.7
Accessories 62.7
Accessory 62.7
Person 61.8
Pants 59.6
Indoors 58.7
Floor 58.5
Play 56.1
Shorts 56.1
Person 46.4
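
The label/confidence pairs above are the kind of output returned by Amazon's label detection (Rekognition). Below is a minimal sketch of how such tags could be requested with boto3; the local file name photo.jpg, the region, and the confidence threshold are assumptions for illustration, not part of this record.

import boto3

# Assumed inputs for illustration: a local copy of the photograph and a region.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=40,  # the list above bottoms out near 46%, so a low cutoff
)

# Print each label with its confidence, mirroring the "Person 99.4" style above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")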

Clarifai
created on 2023-10-25

people 100
group together 99
monochrome 98.6
adult 98.5
group 98
recreation 97.8
child 97.6
man 97.3
woman 96.5
many 96.5
wear 95.4
music 92.4
actress 91.9
enjoyment 91.8
outfit 91.2
several 88.7
musician 88.5
sports equipment 84.4
guitar 83.5
crowd 82.9

Imagga
created on 2022-01-09

sport 32.4
man 22.8
people 17.8
person 17.5
male 17.1
pooch 16
adult 14.9
active 14.5
exercise 13.6
fitness 13.5
black 13.2
athlete 13
street 12.9
men 12.9
fun 12.7
city 12.5
portrait 12.3
ball 12.2
outdoor 11.5
dress 10.8
weapon 10.5
summer 10.3
lifestyle 10.1
world 10
leisure 10
recreation 9.9
run 9.6
body 9.6
play 9.5
wall 9.4
tricycle 9.4
action 9.3
sword 9.2
silhouette 9.1
fashion 9
team 9
outdoors 9
wheeled vehicle 8.9
dance 8.9
urban 8.7
boy 8.7
motion 8.6
beach 8.4
dark 8.3
competition 8.2
game 8
business 7.9
travel 7.7
photographer 7.7
pretty 7.7
power 7.6
training 7.4
device 7.3
building 7.2
sunset 7.2
activity 7.2
women 7.1
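
Imagga exposes its tagging as a REST endpoint rather than an SDK. The sketch below uses the requests library with HTTP Basic auth against Imagga's documented /v2/tags endpoint; the credentials and image URL are placeholders, not values from this record.

import requests

# Placeholder credentials and image URL; substitute real values.
API_KEY = "your_imagga_api_key"
API_SECRET = "your_imagga_api_secret"
IMAGE_URL = "https://example.org/photo.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Each entry pairs a confidence score with a tag, e.g. "sport 32.4".
for entry in response.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")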

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

outdoor 96
text 93.8
footwear 92.3
person 90.4
clothing 90
black and white 79.8
cartoon 63.6
woman 59.5

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-36
Gender Female, 56.6%
Happy 88.5%
Calm 8.6%
Surprised 2.2%
Fear 0.3%
Disgusted 0.2%
Confused 0.2%
Sad 0.1%
Angry 0.1%

AWS Rekognition

Age 31-41
Gender Male, 95.1%
Happy 53.7%
Calm 21.7%
Surprised 20.4%
Fear 1.7%
Confused 0.8%
Disgusted 0.8%
Angry 0.6%
Sad 0.4%

AWS Rekognition

Age 18-26
Gender Female, 99.9%
Happy 94.6%
Calm 1.5%
Angry 1.1%
Fear 0.9%
Sad 0.5%
Surprised 0.5%
Disgusted 0.5%
Confused 0.3%
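
The three face records above (age range, gender, and a descending emotion breakdown) match the shape of AWS Rekognition's DetectFaces response when all attributes are requested. A minimal sketch, again assuming a local photo.jpg:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back unsorted; sort to match the descending lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")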

Feature analysis

Amazon

Person 99.4%
Dog 95%

Categories

Text analysis

Amazon

SPILL
MILK
Dorothy's
THIS
25
THIS 18
18
sel MILK
DOROTHY S
DOBOTHY
botter
199RE
sel
for
YТ37-X

Google

YT37A2
THIE
Dorothy's
SPIL
25
MILK
DOROTHYS
OROTH
YT37A2 THIE IS Dorothy's SPIL 25 MILK DOROTHYS OROTH
IS
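
The Amazon text detections above, including partial and misread strings such as "DOBOTHY" and "199RE", are typical raw OCR output. A minimal sketch of retrieving them with Rekognition's DetectText, under the same assumed local file:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Rekognition returns both LINE and WORD detections; the list above mixes them.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"], round(detection["Confidence"], 1))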