Human Generated Data

Title

Untitled (girl sitting next to Christmas tree and radio)

Date

1948

People

Artist: Francis J. Sullivan, American 1916 - 1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18140

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Human 98.8
Person 98.8
Furniture 98
Chair 96.4
Tree 94
Plant 94
Nature 92.6
Outdoors 91
Yard 87.5
Vegetation 86.7
Indoors 86.6
Room 78.9
Living Room 73.4
Building 72.3
Housing 72.3
Land 69
Fireplace 66.7
People 64.3
Girl 62.5
Female 62.5
Photography 61.3
Photo 61.3
Door 59.6
Snow 59.2
Clothing 59.2
Apparel 59.2
Kid 57
Child 57
Table 55.6
Person 46.3
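
The Amazon tags above are label names paired with confidence scores (in percent), the kind of output AWS Rekognition's label-detection call returns. A minimal sketch of reproducing such a list with boto3, assuming configured AWS credentials and a placeholder local file name:

    # Sketch: label/confidence pairs from AWS Rekognition (boto3).
    # "photo.jpg" is a placeholder file name, not the museum image path.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=45,  # the list above bottoms out around 46%
        )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')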

Imagga
created on 2022-03-04

street 35.9
building 32.2
wheeled vehicle 28
old 25.1
architecture 22.6
city 21.6
vehicle 20.9
urban 20.1
car 19.2
road 19
town 18.5
sidewalk 17.5
stone 16
travel 15.5
snow 15.3
tricycle 14.4
house 14.3
motor vehicle 13.9
container 13.6
night 13.3
brick 13.2
transportation 12.6
chair 11.9
truck 11.7
light 11.4
tree 10.8
vintage 10.7
wall 10.3
historic 10.1
dark 10
tourism 9.9
ambulance 9.8
houses 9.7
automobile 9.6
winter 9.4
window 9.2
windows 8.6
auto 8.6
lamp 8.6
fire engine 8.5
black 8.4
tourist 8.3
transport 8.2
seat 8.2
history 8
cars 7.8
scene 7.8
antique 7.8
outdoor 7.6
roof 7.6
drive 7.6
historical 7.5
bin 7.4
industrial 7.3
dirty 7.2
sky 7
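
The Imagga tags follow the same pattern and resemble the output of Imagga's v2 tagging endpoint. A rough sketch, assuming that endpoint and response shape (check Imagga's current documentation) and placeholder credentials:

    # Sketch: tag/confidence pairs from Imagga's v2 tagging endpoint.
    # Endpoint, response shape, credentials, and image URL are assumptions.
    import requests

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/photo.jpg"},
        auth=("API_KEY", "API_SECRET"),
    )
    for item in resp.json()["result"]["tags"]:
        print(item["tag"]["en"], round(item["confidence"], 1))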

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

outdoor 98.1
text 97.6
black and white 90.1
old 67.1

Face analysis

Amazon

AWS Rekognition

Age 16-24
Gender Male, 97.2%
Calm 99.8%
Sad 0%
Confused 0%
Disgusted 0%
Surprised 0%
Happy 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 38-46
Gender Male, 98%
Calm 99%
Surprised 0.2%
Sad 0.2%
Happy 0.1%
Disgusted 0.1%
Confused 0.1%
Angry 0.1%
Fear 0%
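
The two face records above (age range, gender with confidence, per-emotion percentages) match the shape of AWS Rekognition's face-detection output. A minimal sketch with boto3, again assuming configured credentials and a placeholder file name:

    # Sketch: per-face age, gender, and emotion estimates from AWS Rekognition.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # needed for age, gender, and emotion fields
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')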

Feature analysis

Amazon

Person 98.8%

Captions

Microsoft

an old photo of a person 74.6%
a group of people standing in front of a building 64.3%
an old photo of a truck 58.8%
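
The Microsoft captions with confidence scores are the kind of result Azure Computer Vision's Describe operation produces. A hedged sketch against the v3.2 REST endpoint, with placeholder resource endpoint, key, and image URL:

    # Sketch: caption candidates from Azure Computer Vision's Describe operation.
    # Endpoint version, key, and image URL are placeholders/assumptions.
    import requests

    endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
    resp = requests.post(
        f"{endpoint}/vision/v3.2/describe",
        params={"maxCandidates": 3},
        headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
        json={"url": "https://example.org/photo.jpg"},
    )
    for caption in resp.json()["description"]["captions"]:
        print(caption["text"], f'{caption["confidence"] * 100:.1f}%')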

Text analysis

Amazon

est

Google

YT37A2- XAO
YT37A2-
XAO
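
The text detections above come from each service's OCR endpoint; the Google list shows the full detected string followed by its individual tokens, which is how Vision-style text responses are typically structured. A minimal sketch of the Amazon side using Rekognition's DetectText via boto3, assuming configured credentials and a placeholder file name:

    # Sketch: OCR-style text detection with AWS Rekognition's DetectText.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        # LINE entries cover whole lines; WORD entries are the individual tokens.
        print(detection["Type"], detection["DetectedText"], f'{detection["Confidence"]:.1f}%')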