Human Generated Data

Title

Untitled (men doing road work; East Derry, NH)

Date

1950

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18381

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Human 99.8
Person 99.8
Person 99.8
Person 99.6
Person 99.6
Person 99.4
Person 97.9
Path 91.3
Outdoors 90.3
Tarmac 84.7
Asphalt 84.7
Nature 82
Road 78.1
Person 76.6
Clothing 76.2
Shorts 76.2
Apparel 76.2
Tree 69.4
Plant 69.4
Pedestrian 68.5
Urban 67.5
Person 61.6
Pavement 60.5
Sidewalk 60.5
Vegetation 59.7
Street 59.1
Building 59.1
City 59.1
Town 59.1
Land 58.5
Yard 58
Person 56
Person 42.2

Imagga
created on 2022-03-04

structure 27.3
mechanical device 27.3
trampoline 25.7
tree 22.7
trees 21.3
mechanism 20.3
building 20.2
gymnastic apparatus 20.1
landscape 20.1
park 19.8
swing 18.8
forest 18.3
greenhouse 18.1
sports equipment 17.3
sprinkler 16.7
grass 16.6
dark 15.9
light 14
plaything 13.6
night 13.3
outdoor 13
travel 12.7
road 12.6
equipment 12.4
outdoors 12.2
shopping cart 11.7
summer 11.6
path 11.3
scene 11.2
old 11.1
day 11
wheeled vehicle 10.6
snow 10.5
architecture 10.3
winter 10.2
sky 10.2
fountain 10.1
house 10
water 10
rural 9.7
mist 9.7
autumn 9.7
handcart 9.7
fog 9.6
street 9.2
sport 9
misty 8.8
woods 8.6
device 8.5
fence 8.4
fun 8.2
fall 8.1
morning 8.1
sun 8
foggy 7.9
early 7.8
cold 7.7
door 7.7
walk 7.6
rain 7.5
lawn 7.5
sliding door 7.5
field 7.5
wood 7.5
city 7.5
environment 7.4
home 7.2
barrier 7.2
spring 7.1
season 7

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

outdoor 99.6
black and white 98.1
playground 92.1
text 91.1
person 89.7
street 86.4
monochrome 83.2
black 69.3
white 66.5
tree 64.6
footwear 64.6
clothing 55.6

Face analysis

AWS Rekognition

Age 21-29
Gender Male, 99.2%
Calm 98.4%
Happy 0.4%
Surprised 0.3%
Sad 0.3%
Disgusted 0.2%
Confused 0.1%
Fear 0.1%
Angry 0.1%

AWS Rekognition

Age 25-35
Gender Male, 72.7%
Calm 99.5%
Sad 0.1%
Disgusted 0.1%
Fear 0.1%
Angry 0.1%
Confused 0%
Surprised 0%
Happy 0%

AWS Rekognition

Age 23-33
Gender Male, 89.8%
Happy 92.1%
Calm 3.1%
Sad 2%
Angry 1.3%
Surprised 0.7%
Fear 0.4%
Disgusted 0.3%
Confused 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%

Captions

Microsoft

a vintage photo of some people that are standing in the street 66.4%
a vintage photo of a group of people standing in front of a building 66.3%
a vintage photo of a group of people standing outside of a building 66.2%

Text analysis

Amazon

32