Human Generated Data

Title

Untitled (child on parade float of car covered with flowers, close-up of young girl and flowers)

Date

c. 1950

People

Artist: Jack Rodden Studio, American, 1914 - 2016

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13465

Machine Generated Data

Tags

Amazon
created on 2022-01-30

Art 95.2
Painting 92.3
Person 92.1
Human 92.1
Plant 89.2
Person 81.2
Flower 65
Blossom 65
Leaf 57.2
Floral Design 57.2
Graphics 57.2
Pattern 57.2

Clarifai
created on 2023-10-29

people 99.8
illustration 99.1
adult 99
group 98.7
print 97.4
wear 97.4
woman 96.2
art 94.5
man 90.2
flower 88
many 86.6
child 85.4
administration 84.6
portrait 84.5
engraving 82.1
vintage 79.7
tree 78.8
veil 78.2
one 77.5
painting 77.4

Imagga
created on 2022-01-30

fountain 53.8
structure 50.5
grunge 49.4
aged 36.2
texture 35.4
old 32.7
vintage 30.6
dirty 28.9
antique 26
wallpaper 23
stain 22.1
pattern 21.9
grungy 19.9
canvas 19
art 18.9
aging 18.2
frame 17.5
black 17.4
wall 16.5
retro 16.4
paint 16.3
material 16.1
graphic 16
rough 15.5
brown 15.5
design 15.2
surface 15
decoration 14.8
damaged 14.3
rusty 14.3
weathered 14.2
textured 14
obsolete 13.4
artistic 13
ancient 13
paper 12.5
decay 12.5
silhouette 12.4
backgrounds 12.2
detail 12.1
floral 11.9
border 11.8
graffito 11.7
color 11.7
drawing 11.7
space 11.6
stained 11.5
worn 11.5
splash 11.4
text 11.3
grain 11.1
effect 10.9
flower 10.8
grime 10.7
mottled 10.7
artwork 10.1
backdrop 9.9
fracture 9.7
messy 9.7
rust 9.6
detailed 9.6
dirt 9.5
textures 9.5
empty 9.4
light 9.4
dark 9.2
element 9.1
style 8.9
painterly 8.9
stains 8.8
burnt 8.7
crack 8.7
forest 8.7
rock 8.7
edge 8.7
spray 8.6
yellow 8.6
spider web 8.6
stone 8.5
plants 8.3
country 7.9
smudged 7.9
urban 7.9
distressed 7.9
torn 7.7
parchment 7.7
japan 7.6
graphics 7.6
poster 7.6
world 7.4
digital 7.3
global 7.3

Google
created on 2022-01-30

Microsoft
created on 2022-01-30

text 95.1
drawing 92.7
flower 85
sketch 83.9
painting 72.1
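
The tag lists above resemble the label/confidence output of commercial image-tagging APIs. As one illustration, here is a minimal sketch of retrieving comparable labels with Amazon Rekognition's DetectLabels via boto3; it assumes AWS credentials are configured and that "photo.jpg" is a hypothetical local copy of the image, not the museum's actual pipeline:

    # Sketch only: label detection with Amazon Rekognition (boto3).
    # "photo.jpg" is a placeholder for a local copy of the image.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,  # drop labels below 55% confidence
        )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')

Clarifai, Imagga, Google, and Microsoft each expose their own client libraries, but the output shape is similar: a flat list of tags with confidence scores like those listed above.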

Color Analysis

Face analysis

AWS Rekognition

Age 37-45
Gender Male, 67%
Happy 56.1%
Calm 24.5%
Sad 7.3%
Surprised 4.1%
Confused 3%
Disgusted 2.1%
Fear 1.9%
Angry 1.1%

AWS Rekognition

Age 21-29
Gender Male, 90.1%
Sad 77%
Disgusted 7.9%
Calm 4.4%
Fear 3.1%
Happy 2.8%
Surprised 2.6%
Confused 1.2%
Angry 1%
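
The two blocks above look like per-face results from Amazon Rekognition's DetectFaces. A minimal sketch of requesting the same attributes (age range, gender, emotions) with boto3, again using the hypothetical "photo.jpg":

    # Sketch only: face attribute estimation with Amazon Rekognition (boto3).
    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # include age range, gender, and emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.0f}%')
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')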

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
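
The likelihood labels above ("Very unlikely" through "Very likely") correspond to the enum values returned by Google Cloud Vision face detection. A minimal sketch with the google-cloud-vision client library, assuming credentials are configured and the same hypothetical local file:

    # Sketch only: face likelihoods with Google Cloud Vision.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)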

Feature analysis

Amazon

Painting 92.3%
Person 92.1%
Person 81.2%

Categories

Text analysis

Amazon

12
THE

Google

12
12
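
The detected strings ("12", "THE") are the kind of short fragments returned by OCR-style text detection on a photograph. A minimal sketch with Amazon Rekognition's DetectText via boto3, using the same hypothetical local file:

    # Sketch only: text detection with Amazon Rekognition (boto3).
    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":  # skip individual WORD detections
            print(detection["DetectedText"], f'{detection["Confidence"]:.1f}%')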