Human Generated Data

Title

Untitled (couple with carnival prizes)

Date

1952

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7745

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.5
Human 99.5
Person 99.4
Person 98.7
Apparel 98.3
Clothing 98.3
Person 95
Person 89.7
Costume 89.5
Face 82.5
Female 82.3
Animal 79.9
Mammal 76.3
Dog 75.5
Canine 75.5
Pet 75.5
Photo 70.2
Photography 70.2
Portrait 69.5
People 68.5
Girl 65.3
Coat 65
Dog 64.9
Overcoat 63.5
Suit 63.5
Person 63.1
Man 62.6
Person 62.4
Advertisement 61.5
Poster 61.5
Crowd 60.8
Woman 60.7
Kid 58.6
Child 58.6
Plant 56.3
Tree 56.3
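
The label list above is the kind of output Amazon Rekognition's label detection returns. Below is a minimal sketch of how such tags could be generated with boto3; the file name and the confidence cutoff are assumptions, not part of this record.

```python
# Minimal sketch (assumptions: hypothetical local file name, AWS credentials already configured).
# Rekognition's detect_labels returns label names with confidence scores like those listed above.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_4.2002.7745.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # the tag list above bottoms out in the mid-50s
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```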

Clarifai
created on 2023-10-25

people 100
group 99
group together 99
recreation 97.6
adult 97.6
monochrome 97.5
man 96.2
woman 96.1
many 95.9
music 95.9
child 95.9
actress 94.4
administration 93.4
wear 93.1
several 91.7
military 91.1
two 90.4
musician 90.4
family 90.1
actor 89.9

Imagga
created on 2022-01-09

man 19.5
city 19.1
male 18.5
sport 17.9
people 16.2
person 16
urban 15.7
portrait 15.5
adult 15.1
black 13.9
weapon 13.9
street 12.9
men 12.9
sword 12.1
play 10.3
wall 10.3
dress 9.9
silhouette 9.9
fashion 9.8
legs 9.4
model 9.3
leisure 9.1
business 9.1
summer 9
team 9
sexy 8.8
women 8.7
outdoor 8.4
attractive 8.4
active 8.1
group 8.1
success 8
building 8
athlete 7.9
high 7.8
run 7.7
human 7.5
life 7.5
action 7.4
art 7.4
world 7.4
clothing 7.3
playing 7.3
body 7.2
game 7.1
day 7.1
travel 7
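
The Imagga tags follow the same name-plus-score pattern. A hedged sketch against Imagga's public v2 tagging endpoint is shown below; the credentials and image URL are placeholders.

```python
# Hedged sketch (assumptions: placeholder credentials and a hypothetical image URL).
# Imagga's /v2/tags endpoint returns scored tags comparable to the list above.
import requests

API_KEY = "your_imagga_api_key"        # placeholder
API_SECRET = "your_imagga_api_secret"  # placeholder
IMAGE_URL = "https://example.org/steinmetz_4.2002.7745.jpg"  # hypothetical URL

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```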

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

outdoor 98.3
footwear 95.8
clothing 92.9
person 90.9
black and white 89.2
text 88.7
cartoon 74.5
man 69.2
street 64.4
posing 64
woman 52.9
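
The Microsoft tags are consistent with Azure Computer Vision image tagging, which scores tags from 0 to 1; the list above appears to show those scores as percentages. A sketch using the azure-cognitiveservices-vision-computervision SDK follows; the endpoint, key, and image URL are placeholders.

```python
# Sketch only (assumptions: placeholder endpoint, key, and a hypothetical image URL).
# Azure Computer Vision's tag_image returns tag names with 0-1 confidences.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("<your-key>"),              # placeholder key
)

result = client.tag_image("https://example.org/steinmetz_4.2002.7745.jpg")  # hypothetical URL
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```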

Color Analysis

Face analysis

AWS Rekognition

Age 37-45
Gender Female, 75.2%
Happy 83.5%
Calm 12.8%
Surprised 1.5%
Angry 0.8%
Disgusted 0.5%
Confused 0.4%
Fear 0.3%
Sad 0.2%

AWS Rekognition

Age 33-41
Gender Male, 97.2%
Happy 73.5%
Calm 12.9%
Surprised 10.6%
Disgusted 0.9%
Fear 0.9%
Confused 0.5%
Sad 0.5%
Angry 0.3%
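
The two AWS Rekognition blocks above report an age range, a gender estimate, and ranked emotions for each detected face. A minimal sketch of Rekognition's detect_faces call with Attributes=["ALL"], which returns these fields, is shown below; the file name is an assumption.

```python
# Minimal sketch (assumptions: hypothetical local file name, AWS credentials already configured).
# detect_faces with Attributes=["ALL"] returns AgeRange, Gender, and Emotions per detected face.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_4.2002.7745.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```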

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
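
Google Vision reports likelihood buckets (Very unlikely through Very likely) instead of percentages. A sketch using the google-cloud-vision client follows; the file name is an assumption.

```python
# Sketch only (assumptions: hypothetical file name, Google Cloud credentials already configured).
# face_detection returns likelihood enums (VERY_UNLIKELY ... VERY_LIKELY) for each attribute.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_4.2002.7745.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

likelihood = ("Unknown", "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely")

for face in client.face_detection(image=image).face_annotations:
    print("Surprise", likelihood[face.surprise_likelihood])
    print("Anger", likelihood[face.anger_likelihood])
    print("Sorrow", likelihood[face.sorrow_likelihood])
    print("Joy", likelihood[face.joy_likelihood])
    print("Headwear", likelihood[face.headwear_likelihood])
    print("Blurred", likelihood[face.blurred_likelihood])
```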

Feature analysis

Amazon

Person 99.5%
Dog 75.5%
Poster 61.5%

Categories

Text analysis

Amazon

25
SPILL
Dorothy's
IS
DOROTHY
THE IS
THE
MILK
09E
food
will
YТ3А-А
DOES
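
The word fragments above, including apparent misreads of reversed or partially hidden sign lettering, are raw output of Amazon Rekognition text detection. A minimal sketch of the detect_text call follows; the file name is an assumption.

```python
# Minimal sketch (assumptions: hypothetical local file name, AWS credentials already configured).
# detect_text returns LINE and WORD detections; the tokens above correspond to the detected text.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_4.2002.7745.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        print(detection["DetectedText"])
```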

Google

SPILL
MILK
Dorothy's
25
OROTH
YT33A2 DODDTHY THIE IS SPILL MILK Dorothy's 25 OROTH
YT33A2
DODDTHY
THIE
IS
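
In the Google list, the long line that strings the tokens together is consistent with Google Vision text detection, which returns one full-text annotation alongside per-word annotations. A sketch with the google-cloud-vision client follows; the file name is an assumption.

```python
# Sketch only (assumptions: hypothetical file name, Google Cloud credentials already configured).
# text_detection returns a full-text annotation plus one annotation per detected word.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_4.2002.7745.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

annotations = client.text_detection(image=image).text_annotations
if annotations:
    print("Full text:", annotations[0].description)
    for word in annotations[1:]:
        print(word.description)
```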