Human-Generated Data

Title

Untitled (two girls holding cotton candy in front of circus midway booth)

Date

1947

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4583

Copyright

© Estate of Joseph Janney Steinmetz

Machine-Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.6
Human 99.6
Person 99.6
Shoe 98.9
Footwear 98.9
Clothing 98.9
Apparel 98.9
Shoe 98.1
Shoe 94.9
Plaid 88.7
Tartan 88.7
Leisure Activities 83
Plant 81.4
Vase 81.4
Potted Plant 81.4
Jar 81.4
Pottery 81.4
Shoe 79.5
Skirt 74.6
Kilt 74.6
Person 69.5
Person 68.9
People 67
Planter 64.2
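
The labels above are the kind of output returned by Amazon Rekognition's DetectLabels API, with confidence scores on a 0-100 scale. A minimal sketch of how such tags could be regenerated, assuming boto3 is installed and AWS credentials are configured; the file name is hypothetical:

    import boto3

    # Sketch only: regenerate Rekognition-style labels for a local scan.
    # The file name is a placeholder; AWS credentials must be configured.
    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=60,  # the list above bottoms out around 64
        )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")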

Clarifai
created on 2023-10-15

people 99.9
group 98.8
many 98
group together 97.6
adult 96.8
music 96.8
man 93.9
woman 93.9
child 93.2
wear 92.4
several 90.8
musician 90.7
outfit 90
recreation 88.9
actress 84.7
monochrome 84.3
guitar 83.8
administration 81.2
singer 80.5
instrument 77.8
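
Clarifai's general-recognition model returns concept scores like those above (as fractions of 1, shown here scaled to percentages). A rough sketch against Clarifai's v2 REST API; the personal access token, model name, and image URL are placeholders, and the endpoint shape may vary by API version:

    import requests

    # Sketch only: query Clarifai's v2 API for concept tags.
    # CLARIFAI_PAT and the image URL are placeholders.
    CLARIFAI_PAT = "YOUR_PAT"
    url = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"

    payload = {"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]}
    headers = {"Authorization": f"Key {CLARIFAI_PAT}"}

    resp = requests.post(url, json=payload, headers=headers, timeout=30)
    resp.raise_for_status()

    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")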

Imagga
created on 2021-12-14

blackboard 80.2
grunge 27.2
old 21.6
antique 19.9
vintage 19.8
texture 19.4
art 17.6
aged 16.3
grungy 15.2
pattern 15
wall 14.5
black 13.8
dirty 13.6
shop 13.1
ancient 13
building 13
decoration 12.9
retro 12.3
design 11.8
paint 11.8
architecture 11.7
season 10.9
decorative 10.9
border 10.9
structure 10.8
frame 10.8
urban 10.5
forest 10.4
brown 10.3
house 10.1
worn 9.5
stall 9.4
cemetery 9.2
city 9.1
color 8.9
window 8.8
graphic 8.8
text 8.7
light 8.7
rust 8.7
damaged 8.6
tree 8.5
sign 8.3
landscape 8.2
style 8.2
business 7.9
textured 7.9
snow 7.7
wallpaper 7.7
winter 7.7
dirt 7.6
canvas 7.6
dark 7.5
mercantile establishment 7.4
plants 7.4
man 7.4
rough 7.3
night 7.1
paper 7.1
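
Imagga's tagging service produces this style of output via its documented v2 tags endpoint, authenticated with HTTP Basic auth. A sketch with placeholder credentials and image URL:

    import requests

    # Sketch only: request tags from Imagga's v2 API.
    # The API key/secret and image URL are placeholders.
    auth = ("YOUR_API_KEY", "YOUR_API_SECRET")  # HTTP Basic auth
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/photo.jpg"},
        auth=auth,
        timeout=30,
    )
    resp.raise_for_status()

    for item in resp.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")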

Microsoft
created on 2021-12-14

text 99.6
clothing 97
person 95.4
woman 78.7
old 71
footwear 65.8
white 62.2
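
Microsoft's tags come from the Azure Computer Vision service, which reports confidence as a fraction of 1. A sketch against the v3.2 REST tag endpoint; the resource endpoint, key, and image URL are placeholders:

    import requests

    # Sketch only: request tags from Azure Computer Vision v3.2.
    # Endpoint, key, and image URL are placeholders.
    endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
    headers = {"Ocp-Apim-Subscription-Key": "YOUR_KEY"}

    resp = requests.post(
        f"{endpoint}/vision/v3.2/tag",
        headers=headers,
        json={"url": "https://example.com/photo.jpg"},
        timeout=30,
    )
    resp.raise_for_status()

    for tag in resp.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")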

Face analysis

AWS Rekognition

Age 17-29
Gender Female, 89.7%
Calm 96.6%
Happy 2.5%
Sad 0.7%
Angry 0.1%
Confused 0.1%
Surprised 0%
Disgusted 0%
Fear 0%
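
The age range, gender, and emotion percentages above map directly onto Rekognition's DetectFaces response when all attributes are requested. A minimal sketch, again with a hypothetical file name:

    import boto3

    # Sketch only: reproduce the age/gender/emotion readout with DetectFaces.
    # The file name is a placeholder; AWS credentials must be configured.
    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")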

Google Vision (seven faces detected; identical ratings returned for each)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
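
Google Vision reports per-face likelihood buckets (Very unlikely through Very likely) rather than percentages. A sketch using the google-cloud-vision client library, assuming Google Cloud credentials are configured; the file name is hypothetical:

    from google.cloud import vision

    # Sketch only: print likelihood buckets for each detected face.
    # Assumes configured credentials; the file name is a placeholder.
    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)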

Feature analysis

Amazon

Person 99.6%
Shoe 98.9%

Text analysis

Amazon

Bailey
& Bailey
121
&
LING
TS
2
ROS.
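
The word fragments above are raw OCR output of the booth signage, as returned by Rekognition's DetectText API. A minimal sketch with a hypothetical file name:

    import boto3

    # Sketch only: extract text detections from the scanned photograph.
    # The file name is a placeholder; AWS credentials must be configured.
    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":  # WORD entries are also returned
            print(detection["DetectedText"])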

Google

ROS.
TS MAMALA LING ROS. Bailey 121
TS
MAMALA
LING
Bailey
121