Human Generated Data

Title

Untitled (four mannequins in shop window)

Date

c. 1950

People

Artist: Jack Rodden Studio, American, 1914 - 2016

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13416

Machine Generated Data

Tags

Amazon
created on 2022-01-30

Person 98.9
Human 98.9
Person 97
Person 96.8
Person 95.3
Shop 78.6
Clothing 76.7
Apparel 76.7
Horse 73.6
Animal 73.6
Mammal 73.6
Worker 60.8
Hairdresser 59.2
Bag 55.5
Window Display 55.2
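
The Amazon labels above, with their 0-100 confidence scores, are the kind of output returned by AWS Rekognition's DetectLabels operation. The sketch below uses boto3; the local file name, region, and MinConfidence threshold are assumptions, not details from this record.

```python
import boto3

# A minimal sketch, assuming default AWS credentials and a local copy
# of the photograph saved as "photo.jpg" (an assumed file name).
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns label names with confidence scores on a 0-100 scale,
# the same form as the "Person 98.9", "Shop 78.6" values listed above.
response = rekognition.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=55)
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```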

Clarifai
created on 2023-10-29

people 99.7
group 98.1
woman 94.9
adult 94.7
group together 94.3
indoors 94.3
furniture 94.2
room 94
wear 93.4
child 93.2
man 90.3
outfit 89.5
monochrome 88.9
school 88.1
department store 86.7
three 85.8
education 85.2
dressing room 85.1
four 84.6
five 82.4
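
The Clarifai concepts above follow the shape of predictions from Clarifai's general image-recognition model. The sketch below calls the v2 REST endpoint directly; the endpoint path, model name, token, image URL, and response layout are all assumptions rather than details taken from this record.

```python
import requests

# Hypothetical credentials and image location.
CLARIFAI_PAT = "YOUR_CLARIFAI_PAT"
IMAGE_URL = "https://example.org/untitled-four-mannequins.jpg"

# Assumed endpoint for Clarifai's v2 predict call on the general model.
resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {CLARIFAI_PAT}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
# Concept values come back as 0-1 probabilities; scale to match the list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))
```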

Imagga
created on 2022-01-30

case 91.8
window 38.5
shop 34.5
boutique 26.2
interior 25.6
urban 21.8
business 20.6
barbershop 19.3
city 19.1
modern 18.9
people 18.4
architecture 17.9
glass 17.9
light 17.4
mercantile establishment 17.1
women 15.8
building 15.5
house 15
shopping 13.7
adult 13.6
office 13.2
table 13
inside 12.9
men 12.9
furniture 12.8
indoor 12.8
home 12.7
mall 12.7
chair 12.4
buy 12.2
design 11.8
transportation 11.6
indoors 11.4
place of business 11.4
room 11.3
fashion 11.3
group 11.3
motion 11.1
life 11
door 10.8
silhouette 10.7
man 10.7
station 10.7
crowd 10.6
store 10.4
wall 10.3
floor 10.2
transport 10
salon 10
hall 10
travel 9.8
retail 9.5
luxury 9.4
construction 9.4
lifestyle 9.4
move 8.6
blurred 8.6
walking 8.5
commercial 8.4
elegance 8.4
wood 8.3
decoration 8.1
open 8.1
activity 8.1
decor 7.9
entrance 7.7
waiting 7.7
clothing 7.6
walk 7.6
center 7.6
restaurant 7.5
style 7.4
color 7.2
black 7.2
day 7.1
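
The Imagga tags above resemble output from Imagga's tagging endpoint. The sketch below assumes the v2 REST API with basic-auth credentials; the endpoint, credentials, image URL, and response layout are assumptions.

```python
import requests

# Hypothetical credentials and image location.
IMAGGA_KEY = "YOUR_IMAGGA_KEY"
IMAGGA_SECRET = "YOUR_IMAGGA_SECRET"
IMAGE_URL = "https://example.org/untitled-four-mannequins.jpg"

# Assumed /v2/tags endpoint; confidence is already on a 0-100 scale.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
for item in resp.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))
```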

Google
created on 2022-01-30

Microsoft
created on 2022-01-30

clothing 96.7
text 96.1
person 95.3
black and white 92.1
woman 84.5
man 75.9
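
The Microsoft tags above match the kind of output produced by Azure's Computer Vision image-tagging operation. The sketch below uses the REST API; the endpoint path, API version, key, image URL, and response layout are assumptions.

```python
import requests

# Hypothetical Azure resource endpoint, key, and image location.
AZURE_ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
AZURE_KEY = "YOUR_AZURE_KEY"
IMAGE_URL = "https://example.org/untitled-four-mannequins.jpg"

# Assumed path for the v3.2 "tag" operation; confidences come back as 0-1.
resp = requests.post(
    f"{AZURE_ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
    json={"url": IMAGE_URL},
)
for tag in resp.json()["tags"]:
    print(tag["name"], round(tag["confidence"] * 100, 1))
```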

Face analysis

Amazon

AWS Rekognition

Age 13-21
Gender Male, 98.6%
Calm 75.7%
Sad 14.5%
Surprised 3.8%
Confused 3.6%
Angry 0.9%
Disgusted 0.6%
Happy 0.5%
Fear 0.4%
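
The age range, gender estimate, and emotion percentages above are the fields AWS Rekognition's DetectFaces operation returns when all facial attributes are requested. A sketch, assuming boto3, default credentials, and a local copy of the image:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:  # assumed local file name
    image_bytes = f.read()

# Attributes=["ALL"] adds AgeRange, Gender, and Emotions to each face record.
response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
for face in response["FaceDetails"]:
    age = face["AgeRange"]        # e.g. {"Low": 13, "High": 21}
    gender = face["Gender"]       # e.g. {"Value": "Male", "Confidence": 98.6}
    print(f'Age {age["Low"]}-{age["High"]}, {gender["Value"]} {gender["Confidence"]:.1f}%')
    # Each emotion comes with its own confidence score.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```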

Feature analysis

Amazon

Person 98.9%
Horse

Text analysis

Amazon

tim
KODAK--EITW
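
Fragments like "tim" and "KODAK--EITW" are typical of what Rekognition's DetectText operation returns for partial or edge-of-frame lettering. A sketch, assuming boto3 and a local copy of the image:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:  # assumed local file name
    image_bytes = f.read()

# DetectText returns LINE and WORD detections, each with a confidence score.
response = rekognition.detect_text(Image={"Bytes": image_bytes})
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f'{detection["Confidence"]:.1f}')
```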

Google

MJIA--YT33A2--AGO
MJIA--YT33A2--AGO
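
The repeated Google string above is consistent with Cloud Vision text detection, which returns the full detected text first and then its individual components. A sketch using the google-cloud-vision client; the file name and credential setup are assumptions.

```python
from google.cloud import vision

# A minimal sketch, assuming Google Cloud credentials are configured
# and a local copy of the photograph saved as "photo.jpg".
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

# text_annotations[0] holds the full detected text; later entries are the
# individual pieces, which is why the same string can appear more than once
# in a flat listing.
response = client.text_detection(image=image)
for annotation in response.text_annotations:
    print(annotation.description)
```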