Human Generated Data

Title

Untitled (young women and two boys posed in dresses and costumes outside school)

Date

c. 1956

People

Artist: Claseman Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11064

Machine Generated Data

Tags

Amazon
created on 2019-03-25

Human 98.9
Person 98.9
Person 97.9
Person 97.9
Person 97.6
Person 96
Person 94.3
Clothing 92.7
Apparel 92.7
Person 92
Person 89.7
Person 85.8
Person 82.4
Person 78.7
Female 77.2
Shorts 73.6
People 70.1
Person 69.6
Dress 68.7
Building 61.5
Girl 60.2
School 59.4
Housing 58.4
Skirt 55.5
Crowd 55.3
Outdoors 55

Clarifai
created on 2019-03-25

people 100
group 99.9
many 99.8
adult 99
group together 99
woman 97.6
child 96.8
wear 95.1
several 94.9
man 94.8
outfit 94.5
education 94.3
leader 93.9
administration 93.9
school 89.5
home 87.3
crowd 85.4
recreation 83.9
military 81.9
uniform 80.6

Imagga
created on 2019-03-25

building 20.3
city 16.6
architecture 16.6
people 16.2
old 14.6
window 14.4
man 14.1
musical instrument 13.3
barbershop 12.7
cemetery 11.7
university 11.7
business 11.5
urban 11.4
male 11.3
travel 11.3
scene 11.2
brass 11.1
history 10.7
room 10.4
women 10.3
shop 10
wind instrument 9.8
office 9.8
night 9.8
chair 9.8
life 9.7
indoors 9.7
love 9.5
men 9.4
historical 9.4
winter 9.4
house 9.3
world 9.1
landmark 9
gate 9
turnstile 8.9
couple 8.7
statue 8.7
sitting 8.6
monument 8.4
street 8.3
tourism 8.2
park 8.2
classroom 8.1
transportation 8.1
religion 8.1
mercantile establishment 8
light 8
person 8
family 8
adult 7.7
school 7.7
kin 7.6
structure 7.5
silhouette 7.4
inside 7.4
historic 7.3
station 7.2
home 7.2
working 7.1
work 7.1
day 7.1
gymnasium 7

Google
created on 2019-03-25

Photograph 97
Snapshot 86.5
Black-and-white 68.3
Room 65.7
History 62.6
Photography 62.4
Family 56.6

Microsoft
created on 2019-03-25

group 70.2
white 65.4
old 52.1
posing 49.2
school 49.2
child 15.2
person 5.5
ceremony 3.9
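
As a rough illustration of how label tags like those above can be produced, the sketch below calls Amazon Rekognition's detect_labels operation through boto3. It is a minimal, hypothetical example: it assumes configured AWS credentials, and "photo.jpg" is a placeholder file name rather than the catalogued image. The other providers listed (Clarifai, Imagga, Google, Microsoft) expose comparable label-detection endpoints.

    import boto3

    # Hypothetical sketch: request labels for a local image from Amazon Rekognition.
    # Assumes AWS credentials are configured; "photo.jpg" is a placeholder path.
    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=30,        # cap the number of labels returned
        MinConfidence=55,    # drop labels below 55 confidence
    )

    # Print label name and confidence, matching the "Label  NN.N" layout above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')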

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Female, 51.9%
Surprised 45.2%
Confused 45.2%
Calm 46.1%
Disgusted 45.2%
Angry 45.3%
Happy 45.2%
Sad 52.8%

AWS Rekognition

Age 23-38
Gender Female, 51.2%
Happy 46%
Disgusted 49.3%
Sad 47.1%
Angry 45.5%
Confused 45.3%
Surprised 45.5%
Calm 46.3%

AWS Rekognition

Age 38-57
Gender Female, 52.3%
Confused 45.5%
Happy 46.8%
Angry 45.5%
Disgusted 45.4%
Sad 46.2%
Calm 50.1%
Surprised 45.6%

AWS Rekognition

Age 23-38
Gender Female, 54.8%
Happy 50%
Sad 45.6%
Calm 46.8%
Angry 45.9%
Confused 45.9%
Disgusted 45.4%
Surprised 45.5%

AWS Rekognition

Age 23-38
Gender Male, 50.2%
Happy 45.4%
Confused 45.7%
Calm 47%
Sad 50%
Disgusted 45.3%
Angry 46%
Surprised 45.7%

AWS Rekognition

Age 26-43
Gender Male, 51.5%
Confused 45.3%
Happy 45.5%
Sad 46%
Surprised 45.5%
Angry 45.3%
Disgusted 45.2%
Calm 52.3%

AWS Rekognition

Age 35-52
Gender Female, 54.1%
Sad 47.7%
Happy 46.9%
Confused 45.7%
Surprised 46.1%
Disgusted 45.3%
Calm 47.7%
Angry 45.6%

AWS Rekognition

Age 26-43
Gender Female, 50.7%
Calm 53.3%
Disgusted 45.1%
Confused 45.1%
Happy 45.3%
Surprised 45.1%
Angry 45.2%
Sad 45.9%

AWS Rekognition

Age 26-43
Gender Female, 52.1%
Happy 47.3%
Angry 45.6%
Sad 47.2%
Confused 45.5%
Disgusted 45.7%
Calm 48.1%
Surprised 45.7%
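
Each face block above lists an estimated age range, a gender guess with its confidence, and per-emotion confidences returned by AWS Rekognition. A minimal sketch of the corresponding detect_faces call is given below, again assuming boto3, configured credentials, and a placeholder image path.

    import boto3

    # Hypothetical sketch: face attributes (age range, gender, emotions) via Rekognition.
    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:   # placeholder image path
        image_bytes = f.read()

    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],   # request age, gender, emotions, etc.
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')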

Feature analysis

Amazon

Person 98.9%

Categories

Text analysis

Amazon

LIE
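
The entry above is text that Amazon Rekognition detected in the image. A hedged sketch of the detect_text call that yields this kind of output, under the same assumptions (boto3, configured credentials, placeholder image path):

    import boto3

    # Hypothetical sketch: OCR-style text detection with Amazon Rekognition.
    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:   # placeholder image path
        image_bytes = f.read()

    response = client.detect_text(Image={"Bytes": image_bytes})

    # Print each detected line of text (WORD-level detections are also returned).
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])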