Human Generated Data

Title

Untitled (couple in front of carnival booth)

Date

1952

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7749

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.8
Human 99.8
Person 99.7
Clothing 99.6
Apparel 99.6
Shoe 94.4
Footwear 94.4
Shorts 93.6
Female 87
Pants 86.7
Chair 83.1
Furniture 83.1
Face 73.3
Woman 72.2
Portrait 67.6
Photography 67.6
Photo 67.6
People 66
Dress 65.4
Grass 63.1
Plant 63.1
Door 57.9
Helmet 56
Shoe 54.7
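
These Amazon tags are name/confidence pairs of the kind returned by AWS Rekognition's DetectLabels API. As an illustration only (the exact pipeline behind this record is not documented here), a minimal Python sketch, assuming boto3 credentials are configured; the filename and thresholds are assumptions:

import boto3

rekognition = boto3.client("rekognition")

# Hypothetical filename; the actual source image is not part of this record.
with open("steinmetz_4.2002.7749.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,        # assumption: roughly the number of tags listed above
    MinConfidence=50.0,  # assumption: the lowest confidence shown above is 54.7
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")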

Imagga
created on 2022-01-09

brass 93.6
wind instrument 73.9
trombone 66.5
musical instrument 52.5
weapon 37
cornet 35.8
man 32.9
male 27.6
sport 23.9
playing 21.9
adult 19.4
play 18.1
instrument 18
gun 17.4
bow 17
person 16.3
recreation 16.1
club 16
leisure 15.8
people 15.6
outdoors 14.9
competition 14.6
fun 13.5
game 13.4
golf 13.4
ball 13.1
golfer 12.7
device 12.5
swing 12.1
arrow 11.9
grass 11.9
active 11.8
soldier 11.7
rifle 11.6
player 11.6
summer 10.9
military 10.6
boy 10.4
violin 10.1
guy 10.1
target 10
outdoor 9.9
holding 9.9
posing 9.8
war 9.7
looking 9.6
men 9.4
music 9.4
field 9.2
protection 9.1
danger 9.1
clothing 9
activity 9
handsome 8.9
projectile 8.6
day 8.6
portrait 8.4
child 8.3
exercise 8.2
musician 8.1
lifestyle 7.9
armed 7.9
camouflage 7.8
happiness 7.8
driving 7.7
equipment 7.7
skill 7.7
professional 7.7
attractive 7.7
sky 7.6
hobby 7.6
action 7.5
park 7.4
suit 7.2
kid 7.1
uniform 7
bugle 7
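
The Imagga tags follow the same tag-plus-confidence shape. A minimal sketch against Imagga's v2 tagging endpoint, assuming the documented JSON response layout; the key, secret, and image URL below are placeholders, not real values:

import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/image.jpg"},  # placeholder URL
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # placeholder credentials
)
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")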

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

outdoor 99.7
grass 95.7
text 91.1
standing 80.3
player 80.1
musical instrument 74.7
posing 73.6
dance 50.3

Face analysis

AWS Rekognition

Age 23-33
Gender Female, 87.3%
Happy 87%
Calm 7.2%
Surprised 4.3%
Sad 0.5%
Fear 0.5%
Disgusted 0.2%
Confused 0.1%
Angry 0.1%

AWS Rekognition

Age 30-40
Gender Female, 81.2%
Calm 70.9%
Happy 18.4%
Surprised 8.7%
Sad 0.7%
Angry 0.5%
Disgusted 0.4%
Confused 0.3%
Fear 0.1%
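
Both AWS Rekognition blocks above (one per detected face) report an estimated age range, a gender guess with confidence, and a ranked emotion distribution. A minimal sketch of the DetectFaces call that produces this shape, assuming boto3 and the same hypothetical local scan:

import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_4.2002.7749.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # AgeRange, Gender, and Emotions require ALL
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")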

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
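
The Google Vision blocks report the same per-face attributes as likelihood buckets rather than percentages. A minimal sketch using the google-cloud-vision client, assuming GCP credentials are configured; the filename is again hypothetical:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_4.2002.7749.jpg", "rb") as f:  # hypothetical filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)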

Feature analysis

Amazon

Person 99.8%
Shoe 94.4%

Captions

Microsoft

a group of people posing for a photo 89.8%
a group of people posing for a picture 89.7%
a group of people posing for the camera 89.6%
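
The three Microsoft captions are ranked alternatives with confidences, the output shape of Azure's image-description service. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and filename are all placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<region>.api.cognitive.microsoft.com",  # placeholder endpoint
    CognitiveServicesCredentials("YOUR_KEY"),        # placeholder key
)

with open("steinmetz_4.2002.7749.jpg", "rb") as f:  # hypothetical filename
    description = client.describe_image_in_stream(f, max_candidates=3)

for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")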

Text analysis

Amazon

SA
34624
SINCAPORE SA
SADIES
SINCAPORE
SINGAPORE SADIES
SINGAPORE
"GIRLESK
am
YT37A2-XAQOX
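
The Amazon strings above, including the mirrored and misread fragments, are raw OCR output of the kind Rekognition's DetectText produces on signage. A minimal sketch, assuming boto3 and the same hypothetical local scan:

import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_4.2002.7749.jpg", "rb") as f:  # hypothetical filename
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # WORD entries repeat the same text
        print(detection["DetectedText"])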

Google

YT3RA2- AGO GIRLESKSINGAFOR S
YT3RA2-
AGO
GIRLESKSINGAFOR
S
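
The Google results match the shape of Vision API text detection, where the first annotation is the full recovered string and the rest are its individual tokens. A minimal sketch, assuming the same hypothetical image file:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_4.2002.7749.jpg", "rb") as f:  # hypothetical filename
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
for annotation in response.text_annotations:
    print(annotation.description)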