Human Generated Data

Title

Untitled (group posing in front of busses)

Date

1938

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4301

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Human 99
Person 99
Person 97.1
Person 97
Person 95.4
Person 94.1
Transportation 92.2
Vehicle 92.2
Bus 92.2
Person 91.7
Person 91.3
Person 90.9
Nature 88.7
Outdoors 87
Rural 83.3
Shelter 83.3
Countryside 83.3
Building 83.3
Tree 83.1
Plant 83.1
Person 81.5
Person 80.4
Person 77.4
Person 74
Crowd 70.2
Person 69.7
People 68.6
Person 68.4
Pedestrian 66.5
Ice 66.3
Person 63.3
Housing 62.6
Street 60.9
Road 60.9
Urban 60.9
City 60.9
Town 60.9
Architecture 60.1
Military 57.8
Funeral 56
Snow 56
Villa 55.8
House 55.8
Person 52.8
Person 44
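
Each Amazon tag above is a label name paired with a confidence score on a 0–100 scale. As an illustrative sketch (not part of the museum record), labels in this format can be filtered by a confidence threshold; the pairs below are copied from the list above, and the threshold of 90 is an arbitrary choice for demonstration:

```python
# Illustrative sketch: filter machine-generated labels by confidence.
# The (label, confidence) pairs are copied from the Amazon tag list above.
labels = [
    ("Human", 99.0), ("Person", 99.0), ("Transportation", 92.2),
    ("Vehicle", 92.2), ("Bus", 92.2), ("Nature", 88.7),
    ("Outdoors", 87.0), ("Crowd", 70.2), ("Snow", 56.0),
]

def confident_labels(pairs, threshold=90.0):
    """Return label names whose confidence meets the threshold."""
    return [name for name, score in pairs if score >= threshold]

print(confident_labels(labels))
# ['Human', 'Person', 'Transportation', 'Vehicle', 'Bus']
```

Duplicate labels (the many "Person" entries) correspond to separate detected instances, so a real filter would typically keep them rather than deduplicate.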

Clarifai
created on 2019-06-01

people 99.7
group together 99
vehicle 97.4
many 97.2
transportation system 96.8
group 96.8
monochrome 96
cavalry 95.2
street 94.7
adult 92
crowd 90.9
man 90.7
war 84.7
military 83.7
home 82.9
sepia 82.8
black and white 82.3
carriage 81.3
road 79
administration 78.2

Imagga
created on 2019-06-01

snow 60.9
picket fence 54.6
fence 50.4
building 34.8
structure 33.8
barrier 33.8
architecture 32.7
city 31.6
weather 29.2
winter 23.9
sky 23.8
house 22.8
obstruction 22.7
travel 22.6
history 21.5
old 20.9
street 18.4
night 17.8
tree 17.1
trees 16
urban 15.7
tourism 15.7
landscape 15.6
cold 15.5
town 14.9
river 14.3
road 13.6
religion 13.5
light 13.4
construction 12.8
landmark 12.7
park 12.1
palace 12.1
church 12
culture 12
historic 11.9
season 11.7
water 11.4
historical 11.3
tourist 10.9
tower 10.8
wall 10.5
ancient 9.5
capital 9.5
day 9.4
outdoors 9
bridge 8.7
residential 8.6
vintage 8.3
vacation 8.2
scenery 8.1
new 8.1
home 8
fountain 7.9
scenic 7.9
holiday 7.9
black 7.8
scene 7.8
district 7.8
cloud 7.8
sunny 7.8
clouds 7.6
real 7.6
sunrise 7.5
window 7.5
evening 7.5
famous 7.5
temple 7.4
exterior 7.4
sun 7.3
cemetery 7.1

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

outdoor 88.3
white 76.2
person 74.4
tree 69.4
old 66.6
black and white 59.3
sky 52.7

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 35-53
Gender Female, 50.1%
Disgusted 49.7%
Angry 49.6%
Confused 49.6%
Sad 49.6%
Happy 49.7%
Calm 49.8%
Surprised 49.5%

AWS Rekognition

Age 20-38
Gender Female, 50.5%
Disgusted 49.5%
Confused 49.5%
Angry 49.5%
Calm 49.6%
Surprised 49.5%
Happy 49.8%
Sad 50%

AWS Rekognition

Age 26-44
Gender Female, 50%
Angry 49.5%
Sad 49.9%
Happy 49.6%
Confused 49.5%
Surprised 49.5%
Calm 49.9%
Disgusted 49.5%

AWS Rekognition

Age 15-25
Gender Female, 50.4%
Confused 49.6%
Happy 49.7%
Surprised 49.6%
Angry 49.6%
Disgusted 49.7%
Calm 49.6%
Sad 49.7%

AWS Rekognition

Age 23-38
Gender Female, 50.2%
Angry 49.5%
Sad 49.6%
Surprised 49.5%
Happy 49.5%
Calm 50.2%
Disgusted 49.5%
Confused 49.5%

AWS Rekognition

Age 23-38
Gender Female, 50.2%
Disgusted 49.5%
Calm 49.6%
Sad 50.2%
Confused 49.5%
Angry 49.6%
Surprised 49.5%
Happy 49.5%

AWS Rekognition

Age 16-27
Gender Female, 50.3%
Disgusted 49.8%
Sad 49.8%
Happy 49.5%
Surprised 49.6%
Angry 49.6%
Calm 49.6%
Confused 49.6%

AWS Rekognition

Age 17-27
Gender Female, 50%
Happy 49.5%
Angry 49.6%
Surprised 49.6%
Disgusted 49.7%
Sad 49.8%
Calm 49.6%
Confused 49.7%

AWS Rekognition

Age 26-43
Gender Male, 50%
Calm 49.9%
Surprised 49.6%
Sad 49.6%
Confused 49.6%
Disgusted 49.7%
Happy 49.6%
Angry 49.7%

AWS Rekognition

Age 35-52
Gender Female, 50.5%
Angry 49.6%
Sad 49.6%
Disgusted 49.7%
Surprised 49.5%
Happy 49.7%
Calm 49.8%
Confused 49.5%
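
Each face block above gives an estimated age range, a gender estimate, and per-emotion confidences; the scores here all cluster near 50%, suggesting no single emotion is confidently detected. A minimal sketch of picking the dominant emotion from one record, assuming a plain dict representation (not the actual Rekognition response shape):

```python
# Illustrative sketch: pick the dominant emotion from one face record.
# The values copy the last AWS Rekognition block above; the flat dict
# is an assumption for demonstration, not the real API response shape.
face = {
    "Angry": 49.6, "Sad": 49.6, "Disgusted": 49.7, "Surprised": 49.5,
    "Happy": 49.7, "Calm": 49.8, "Confused": 49.5,
}

def dominant_emotion(emotions):
    """Return the (emotion, confidence) pair with the highest score."""
    return max(emotions.items(), key=lambda item: item[1])

print(dominant_emotion(face))
# ('Calm', 49.8)
```

With scores this close together, the "dominant" emotion is essentially a coin flip, so the margin between the top two scores is often a more useful signal than the winner itself.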

Feature analysis

Amazon

Person 99%
Bus 92.2%

Captions