Human Generated Data

Title

Untitled (teenagers on steps wearing signs, initiation day for sophomores)

Date

1959

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18870

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.8
Human 99.8
Person 99.6
Person 99.5
Person 99.3
Person 99.1
Shoe 97.5
Footwear 97.5
Clothing 97.5
Apparel 97.5
Shoe 91.2
Chess 87.9
Game 87.9
Shoe 75.3
Shorts 70.1
Shoe 65.8
Shoe 59.3
Person 43.7

Clarifai
created on 2023-10-22

people 100
group together 99.6
many 99.1
group 98.4
adult 97.8
child 97.3
several 96.6
man 96.5
woman 95.8
recreation 92.8
education 89.7
five 88.4
boy 88.1
wear 87.5
school 86.5
street 85.5
four 83.2
monochrome 81.7
uniform 79.2
athlete 78.8

Imagga
created on 2022-03-05

people 26.8
man 23.8
city 19.1
person 18.4
silhouette 18.2
adult 17.7
kin 15.6
musical instrument 15.3
black 14.5
urban 14
men 12.9
street 12.9
male 12.9
fashion 12.8
business 12.8
women 12.6
sitting 12
sport 11.4
legs 11.3
boy 11.3
group 11.3
accordion 11
dark 10.9
life 10.7
businessman 10.6
walking 10.4
style 10.4
chair 10.3
exercise 10
wind instrument 9.8
human 9.7
body 9.6
couple 9.6
motion 9.4
lifestyle 9.4
youth 9.4
model 9.3
building 9.3
alone 9.1
portrait 9.1
keyboard instrument 9
sunset 9
cool 8.9
sexy 8.8
child 8.8
travel 8.4
seat 8.4
teenager 8.2
music 8.2
sun 8
pedestrian 7.8
run 7.7
summer 7.7
walk 7.6
window 7.5
one 7.5
outdoors 7.5
performer 7.4
action 7.4
looking 7.2
team 7.2
world 7.1
posing 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

statue 93.4
black and white 90.4
text 87.9
outdoor 85.4
person 85.3
footwear 83.7
clothing 83.7
street 51.7

Face analysis

AWS Rekognition

Age 38-46
Gender Female, 99.6%
Calm 98.9%
Sad 0.7%
Happy 0.1%
Angry 0.1%
Confused 0.1%
Surprised 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 23-33
Gender Male, 82.5%
Calm 56.7%
Happy 24.7%
Sad 11.6%
Confused 2.3%
Disgusted 1.8%
Surprised 1.6%
Angry 0.9%
Fear 0.4%

AWS Rekognition

Age 24-34
Gender Female, 93.7%
Calm 99.7%
Sad 0.2%
Surprised 0%
Confused 0%
Angry 0%
Happy 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 33-41
Gender Female, 97.1%
Calm 95.3%
Sad 3%
Angry 0.7%
Happy 0.3%
Surprised 0.2%
Confused 0.2%
Disgusted 0.2%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Shoe
Person 99.8%
Person 99.6%
Person 99.5%
Person 99.3%
Person 99.1%
Person 43.7%
Shoe 97.5%
Shoe 91.2%
Shoe 75.3%
Shoe 65.8%
Shoe 59.3%

Text analysis

Amazon

A
I'M A
I'M
I AMA
10
RETURNABLE
SOPHOMORE
MAIS
andwore
взаня
AMER
LUB
ora
saapa
RIA ALL
STORE

Google

PHOMORE I AMA PFTURNABLE
PHOMORE
I
AMA
PFTURNABLE