Human Generated Data

Title

Untitled (large group of men and women standing in front of building)

Date

1939

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4153

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Human 99.4
Person 99.4
Person 99.3
Person 99.2
Person 99.1
Person 99
Person 99
Person 98.9
Person 98.4
Person 98.2
Person 97.9
Person 97.5
Person 96.8
Apparel 93.6
Clothing 93.6
People 91.9
Person 89.2
Person 88.5
Drawing 73.2
Art 73.2
Family 67.9
Female 61.3
Crowd 61.2
Sketch 60.5
Sleeve 59.2
Shorts 58.5
Long Sleeve 58.2
Coat 56.3
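
A tag list like the Amazon one above can be produced by flattening an Amazon Rekognition DetectLabels-style response into "Name Confidence" lines. The sketch below is illustrative only: the sample response is hypothetical and is not the actual API output for this photograph.

```python
# Sketch: flatten a Rekognition DetectLabels-style response into
# "Name Confidence" lines like the Amazon tag list above.
# The sample_response dict is hypothetical, not the real output for this image.

def flatten_labels(response, min_confidence=55.0):
    """Return (name, confidence) pairs at or above the threshold, sorted descending."""
    pairs = [(label["Name"], label["Confidence"])
             for label in response["Labels"]
             if label["Confidence"] >= min_confidence]
    return sorted(pairs, key=lambda p: p[1], reverse=True)

sample_response = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.4},
        {"Name": "Apparel", "Confidence": 93.6},
        {"Name": "Sketch", "Confidence": 60.5},
        {"Name": "Coat", "Confidence": 56.3},
    ]
}

for name, confidence in flatten_labels(sample_response):
    print(f"{name} {confidence:.1f}")
```

The `min_confidence` cutoff mirrors how such tag lists typically omit very low-scoring labels.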

Clarifai
created on 2019-06-01

people 99.9
many 99.2
group 99.2
group together 98.8
adult 98
woman 97.1
wear 93.7
man 93.1
education 87.5
leader 85.5
several 84.5
school 81.3
crowd 80.4
administration 79.7
chair 78.7
military 78.1
child 77.3
war 75.2
uniform 73.7
queue 70.7

Imagga
created on 2019-06-01

people 21.7
musical instrument 18.7
barbershop 16.4
shop 15.5
city 14.9
boutique 14.4
women 14.2
business 14
old 13.9
adult 13.8
men 13.7
man 13.4
marimba 13.2
brass 12.5
life 12.1
male 11.3
percussion instrument 11.3
architecture 10.9
mercantile establishment 10.9
dress 10.8
wind instrument 10.7
building 10.7
travel 10.6
person 10.1
businessman 9.7
weapon 9.6
window 9.4
trombone 9.3
church 9.2
history 8.9
new 8.9
group 8.9
outfit 8.8
urban 8.7
couple 8.7
love 8.7
scene 8.6
bride 8.6
walking 8.5
chair 8.5
winter 8.5
wedding 8.3
historic 8.2
outdoors 8.2
team 8.1
catholic 7.9
day 7.8
wall 7.7
clothing 7.5
room 7.3
worker 7.2
place of business 7.2
religion 7.2
snow 7.1
interior 7.1
work 7.1

Google
created on 2019-06-01

Photograph 97.1
Snapshot 86.1
Room 65.7
Team 64.1
Photography 62.4
Black-and-white 56.4

Microsoft
created on 2019-06-01

clothing 90.6
person 85.9
woman 84.2
wedding dress 82.8
standing 77.6
dress 77.2
bride 69.8
posing 66
black 65.3
group 65.2
white 65.2
old 50.6
clothes 15.4

Face analysis

Amazon

AWS Rekognition

Age 35-52
Gender Male, 53.1%
Disgusted 45.4%
Surprised 45.3%
Angry 45.4%
Confused 45.4%
Sad 51.3%
Calm 46.8%
Happy 45.5%

AWS Rekognition

Age 26-43
Gender Male, 52.6%
Disgusted 45.2%
Calm 45.3%
Sad 53.7%
Confused 45.2%
Angry 45.2%
Surprised 45.1%
Happy 45.4%

AWS Rekognition

Age 35-52
Gender Male, 51.7%
Happy 46.9%
Sad 46.2%
Surprised 45.9%
Confused 45.6%
Disgusted 45.9%
Calm 48.5%
Angry 46%

AWS Rekognition

Age 45-65
Gender Male, 51.4%
Calm 48.6%
Disgusted 45.7%
Sad 47.7%
Surprised 45.7%
Happy 45.5%
Confused 46.2%
Angry 45.6%

AWS Rekognition

Age 35-52
Gender Male, 53.1%
Disgusted 45.2%
Calm 49.3%
Surprised 45.3%
Angry 45.5%
Happy 45.2%
Sad 49.2%
Confused 45.4%

AWS Rekognition

Age 26-43
Gender Male, 52.4%
Angry 45.9%
Happy 45.6%
Confused 45.4%
Calm 48.7%
Surprised 45.6%
Sad 47.8%
Disgusted 46.1%

AWS Rekognition

Age 26-44
Gender Male, 53.9%
Angry 45.3%
Happy 45.1%
Calm 52.6%
Surprised 45.2%
Sad 46.5%
Confused 45.2%
Disgusted 45.1%

AWS Rekognition

Age 23-38
Gender Male, 52.4%
Angry 45.1%
Disgusted 45.1%
Confused 45.1%
Calm 45.5%
Sad 51.1%
Surprised 45.1%
Happy 47.9%

AWS Rekognition

Age 48-68
Gender Male, 50.8%
Angry 45.6%
Sad 45.4%
Happy 45.5%
Confused 45.5%
Surprised 45.5%
Calm 48.9%
Disgusted 48.5%

AWS Rekognition

Age 26-43
Gender Male, 54.2%
Disgusted 45.4%
Calm 46.6%
Sad 50.4%
Confused 46.1%
Angry 45.7%
Surprised 45.5%
Happy 45.2%

AWS Rekognition

Age 26-43
Gender Female, 50.9%
Disgusted 45.5%
Calm 51.7%
Angry 45.4%
Sad 45.9%
Happy 45.6%
Surprised 45.5%
Confused 45.5%

AWS Rekognition

Age 35-52
Gender Male, 55%
Sad 45.8%
Happy 45.8%
Surprised 45.2%
Calm 52.7%
Disgusted 45.1%
Confused 45.2%
Angry 45.2%

AWS Rekognition

Age 27-44
Gender Female, 50.5%
Happy 46.1%
Confused 45.5%
Angry 46.4%
Disgusted 45.3%
Surprised 45.3%
Sad 50.3%
Calm 46%

AWS Rekognition

Age 20-38
Gender Female, 50.5%
Angry 45.3%
Happy 45.1%
Calm 47.4%
Surprised 45.1%
Sad 51.9%
Confused 45.2%
Disgusted 45.1%

AWS Rekognition

Age 27-44
Gender Female, 50%
Angry 45.5%
Happy 45.4%
Sad 52.3%
Disgusted 45.2%
Confused 45.4%
Calm 45.9%
Surprised 45.2%

AWS Rekognition

Age 16-27
Gender Male, 51.2%
Confused 45.8%
Surprised 45.4%
Sad 50.5%
Angry 45.5%
Happy 45.4%
Calm 46.7%
Disgusted 45.8%

AWS Rekognition

Age 20-38
Gender Female, 53.5%
Happy 45.9%
Confused 45.6%
Angry 46.3%
Disgusted 46.4%
Surprised 45.5%
Sad 48.3%
Calm 46.9%

AWS Rekognition

Age 19-36
Gender Male, 54.5%
Angry 45.6%
Happy 45.6%
Sad 50%
Disgusted 45.2%
Confused 45.8%
Calm 47.1%
Surprised 45.8%

AWS Rekognition

Age 26-43
Gender Female, 53.7%
Disgusted 45.2%
Confused 45.2%
Sad 52.9%
Happy 45.7%
Angry 45.3%
Surprised 45.1%
Calm 45.6%

AWS Rekognition

Age 26-44
Gender Female, 54%
Happy 45.5%
Confused 45.3%
Disgusted 48.4%
Sad 49.4%
Calm 45.2%
Angry 46%
Surprised 45.2%
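
Each AWS Rekognition record above (an age range, a gender estimate, and a set of emotion scores) can be reduced to its dominant emotion with a small helper. This is a minimal sketch: the sample face mirrors the first record above but is hand-written, and real Rekognition FaceDetail output uses uppercase emotion types (e.g. "SAD") with an AgeRange dict.

```python
# Sketch: pick the top-scoring emotion from a Rekognition DetectFaces-style
# FaceDetail record, as listed in the face-analysis section above.
# sample_face is hypothetical, mirroring the first record's values.

def dominant_emotion(face_detail):
    """Return (emotion_type, confidence) for the highest-scoring emotion."""
    top = max(face_detail["Emotions"], key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

sample_face = {
    "AgeRange": {"Low": 35, "High": 52},
    "Gender": {"Value": "Male", "Confidence": 53.1},
    "Emotions": [
        {"Type": "SAD", "Confidence": 51.3},
        {"Type": "CALM", "Confidence": 46.8},
        {"Type": "HAPPY", "Confidence": 45.5},
    ],
}

print(dominant_emotion(sample_face))  # ('SAD', 51.3)
```

Note how most emotion scores above cluster near 45%, so the "dominant" emotion often wins by only a few points.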

Feature analysis

Amazon

Person 99.4%

Categories

Imagga

interior objects 98.5%

Text analysis

Amazon

HE