Human Generated Data

Title

Untitled (pageant for parade queen)

Date

1941

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2057

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Human 97
Person 96.2
Person 95.7
Person 94.8
Person 94.3
Person 91.8
Person 90.9
Crowd 89.3
Stage 87.9
Clothing 87.3
Apparel 87.3
Person 85.6
Person 84.8
Person 82.5
Person 81.2
Person 80.4
Person 79.7
People 75.9
Person 74.6
Person 71.2
Person 67.6
Person 67.4
Performer 65.8
Person 64.9
Photography 61
Photo 61
Person 57.9
Funeral 56.2
Person 52

Clarifai
created on 2023-10-15

people 99.5
group together 96.5
many 95.7
group 94.8
man 93.9
monochrome 93.1
crowd 91.1
adult 86
ceremony 80.4
woman 80
military 79
war 78.7
soldier 76.9
music 76.6
child 76.1
musician 74.6
street 74.2
uniform 71
leader 69.5
wear 68.2

Imagga
created on 2021-12-14

picket fence 32.3
group 29
silhouette 29
stage 27.5
people 27.3
fence 27.2
crowd 25.9
men 22.3
platform 22.2
business 20
life 20
barrier 19.3
team 18.8
man 17.8
male 17
person 15.8
teamwork 15.8
businessman 15
adult 14.2
city 14.1
art 13.2
urban 13.1
scene 13
obstruction 12.9
flag 12.9
symbol 12.8
women 12.6
design 12.4
structure 12.1
businesswoman 11.8
work 11.8
job 11.5
light 11.4
baron 10.9
sky 10.8
street 9.2
sign 9
black 9
suit 9
landscape 8.9
couple 8.7
standing 8.7
architecture 8.6
outdoor 8.4
house 8.4
human 8.2
spectator 8.2
sun 8
success 8
office 8
sea 7.8
summer 7.7
youth 7.7
friends 7.5
outdoors 7.5
many 7.4
backgrounds 7.3
graphic 7.3
painting 7.2
outfit 7.2
celebration 7.2
day 7.1
vibrant 7

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 92.8
outdoor 90.5
cemetery 79.1
black 76.7
black and white 76.5
grave 71.5
funeral 70.1
white 65.5
sky 59.9
person 55.3
old 42.6

Feature analysis

Amazon

Person 96.2%

Categories

Imagga

text visuals 55.8%
paintings art 37.4%
interior objects 6.3%

Text analysis

Amazon

MJ13
MJ13 YY73A2 A7AA
YY73A2
A7AA