Human Generated Data

Title

Untitled (children playing with ribbons)

Date

1940s

People

Artist: Mary Lowber Tiers, American, 1916-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15879

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 99.7
Human 99.7
Shorts 99.5
Clothing 99.5
Apparel 99.5
Person 99.1
Person 99.1
Person 98.4
Person 98
Person 95.7
Pedestrian 88.5
Person 82.9
Architecture 82.6
Building 82.6
Flooring 79.7
Urban 78.5
Shelter 73.4
Nature 73.4
Outdoors 73.4
Countryside 73.4
Rural 73.4
City 73.1
Town 73.1
Face 67.3
Downtown 67.3
People 66.8
Crowd 66.6
Leisure Activities 65.2
Suit 61.2
Coat 61.2
Overcoat 61.2
Dance Pose 58.9
Female 58.8
Floor 55.6
Back 55
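
These Amazon tags are label detections of the kind the AWS Rekognition DetectLabels API returns, each paired with a confidence score. A minimal sketch of producing such a list with boto3, assuming configured AWS credentials and a hypothetical local copy of the image at image.jpg:

# Minimal sketch: label detection with AWS Rekognition (boto3).
# "image.jpg" is a hypothetical local path; AWS credentials are assumed.
import boto3

rekognition = boto3.client("rekognition")

with open("image.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # the weakest tag shown above scores about 55
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")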

Clarifai
created on 2023-10-29

people 99.8
group 99
many 97.8
architecture 95.8
street 95.7
man 95.5
adult 94.8
group together 94.6
building 92.2
woman 91
several 90.3
art 89.7
monochrome 89
no person 87.2
wear 86.2
crowd 85.4
city 83
child 82.7
illustration 80.9
administration 80.8
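
Clarifai's general recognition model returns concept/confidence pairs like those above. A minimal sketch using the clarifai-grpc Python client; the exact API surface is an assumption here, and the key, model id, and image URL are hypothetical placeholders:

# Minimal sketch (assumed API surface): Clarifai general model via clarifai-grpc.
from clarifai_grpc.channel.clarifai_channel import ClarifaiChannel
from clarifai_grpc.grpc.api import resources_pb2, service_pb2, service_pb2_grpc

stub = service_pb2_grpc.V2Stub(ClarifaiChannel.get_grpc_channel())
metadata = (("authorization", "Key YOUR_CLARIFAI_KEY"),)  # hypothetical key

request = service_pb2.PostModelOutputsRequest(
    model_id="general-image-recognition",  # assumed id of the general model
    inputs=[
        resources_pb2.Input(
            data=resources_pb2.Data(
                image=resources_pb2.Image(url="https://example.com/image.jpg")
            )
        )
    ],
)

response = stub.PostModelOutputs(request, metadata=metadata)
for concept in response.outputs[0].data.concepts:
    print(concept.name, round(concept.value * 100, 1))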

Imagga
created on 2022-02-05

negative 30.5
film 29.5
newspaper 29.3
billboard 23.2
product 22.7
daily 22.6
old 21.6
structure 21.6
architecture 21.1
vintage 20.7
building 20
signboard 18
creation 17.9
grunge 17.9
black 17.4
art 16.6
dirty 16.3
city 15.8
retro 15.6
screen 15.1
antique 14.7
photographic paper 14.6
texture 13.9
frame 13.4
material 12.5
sky 12.1
ancient 12.1
travel 12
damaged 11.4
design 11.2
graphic 10.9
border 10.8
symbol 10.8
pattern 10.3
paper 10.2
rough 10
aged 10
landmark 9.9
computer 9.9
photographic equipment 9.8
photographic 9.8
scratch 9.8
business 9.7
collage 9.6
edge 9.6
urban 9.6
grungy 9.5
space 9.3
history 8.9
frames 8.8
slide 8.8
movie 8.7
light 8.7
water 8.7
rust 8.7
mask 8.6
decoration 8.6
skyline 8.5
buildings 8.5
historical 8.5
tourism 8.2
landscape 8.2
paint 8.1
digital 8.1
office 8
noisy 7.9
scratches 7.9
designed 7.9
layered 7.9
text 7.9
mess 7.8
artistic 7.8
noise 7.8
strip 7.8
messy 7.7
your 7.7
layer 7.7
detailed 7.7
dirt 7.6
weathered 7.6
famous 7.4
letter 7.3
night 7.1
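
Imagga serves its tagger over a REST endpoint that returns tag/confidence pairs like the list above. A minimal sketch with the requests library, assuming a hypothetical key/secret pair and image URL:

# Minimal sketch: Imagga v2 tagging endpoint via REST.
# The key, secret, and image URL are hypothetical placeholders.
import requests

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/image.jpg"},
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),
)

for tag in response.json()["result"]["tags"]:
    print(tag["tag"]["en"], tag["confidence"])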

Microsoft
created on 2022-02-05

text 99.2
black and white 93.2
person 82.3
clothing 80
white 72.2
street 70.2
black 69.4
footwear 65.6
old 65.2
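
The Microsoft tags match the shape of the Azure Computer Vision tagging operation. A minimal sketch, assuming the legacy azure-cognitiveservices-vision-computervision SDK with a hypothetical endpoint and key:

# Minimal sketch (assumed SDK surface): Azure Computer Vision image tagging.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR-RESOURCE.cognitiveservices.azure.com/",  # hypothetical endpoint
    CognitiveServicesCredentials("YOUR_AZURE_KEY"),        # hypothetical key
)

result = client.tag_image("https://example.com/image.jpg")
for tag in result.tags:
    print(tag.name, round(tag.confidence * 100, 1))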

Face analysis

AWS Rekognition

Age 26-36
Gender Male, 69.4%
Calm 92.3%
Confused 2.4%
Sad 1.9%
Surprised 1.3%
Disgusted 1.2%
Angry 0.4%
Happy 0.4%
Fear 0.2%

AWS Rekognition

Age 39-47
Gender Female, 51.7%
Calm 84.4%
Happy 7.4%
Sad 6.8%
Confused 0.5%
Angry 0.3%
Disgusted 0.2%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 54-62
Gender Male, 99.9%
Calm 98.5%
Sad 0.6%
Confused 0.3%
Happy 0.2%
Angry 0.1%
Disgusted 0.1%
Surprised 0.1%
Fear 0%
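
The three blocks above follow the AWS Rekognition DetectFaces response: an estimated age range, a gender call with its confidence, and a confidence-ranked list of emotions per face. A minimal sketch, again assuming the hypothetical image.jpg:

# Minimal sketch: per-face attributes with AWS Rekognition DetectFaces.
import boto3

rekognition = boto3.client("rekognition")

with open("image.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # "ALL" adds AgeRange, Gender, and Emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")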

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
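
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the rows above read "Very unlikely" and "Unlikely". A minimal sketch with the google-cloud-vision client, assuming the same hypothetical image.jpg:

# Minimal sketch: face likelihoods with the google-cloud-vision client.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("image.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each field is a Likelihood enum, VERY_UNLIKELY .. VERY_LIKELY.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)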

Feature analysis

Amazon

Person
Person 99.7%
Person 99.1%
Person 99.1%
Person 98.4%
Person 98%
Person 95.7%
Person 82.9%
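
The per-person percentages repeat the instance-level confidences that DetectLabels attaches to labels it can localize; each instance also carries a bounding box. A minimal sketch of reading them, under the same assumptions as the earlier DetectLabels call:

# Minimal sketch: instance-level "Person" detections from DetectLabels.
import boto3

rekognition = boto3.client("rekognition")

with open("image.jpg", "rb") as f:
    response = rekognition.detect_labels(Image={"Bytes": f.read()})

for label in response["Labels"]:
    if label["Name"] == "Person":
        for instance in label["Instances"]:
            box = instance["BoundingBox"]  # coordinates are ratios of image size
            print(f"Person {instance['Confidence']:.1f}% "
                  f"(left={box['Left']:.2f}, top={box['Top']:.2f})")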

Text analysis

Amazon

SODA

Google

SODA
SODA
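
Both services read the same word, "SODA"; Google lists it twice because its OCR returns the full text block and the individual words separately. A minimal sketch of the Amazon side with the Rekognition DetectText API, assuming the hypothetical image.jpg once more:

# Minimal sketch: OCR with AWS Rekognition DetectText.
import boto3

rekognition = boto3.client("rekognition")

with open("image.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})
for detection in response["TextDetections"]:
    # Rekognition returns both LINE and WORD level detections.
    print(detection["Type"], detection["DetectedText"],
          f"{detection['Confidence']:.1f}")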