Human Generated Data

Title

Untitled (girl sitting in rocking chair holding doll next to Christmas tree)

Date

1948

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18142

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Chair 99.7
Furniture 99.7
Person 98.3
Human 98.3
Tree 96.2
Plant 96.2
Clothing 85.6
Apparel 85.6
Vegetation 85.5
Outdoors 75.4
Yard 72.4
Nature 72.4
Sitting 71.5
Indoors 70.4
Face 68.1
Housing 65.5
Building 65.5
Photography 63.9
Photo 63.9
Couch 62.8
People 62.5
Room 57.3
Fir 56
Abies 56

Clarifai
created on 2023-10-22

people 99.8
group together 99.6
canine 99.5
group 98.7
vehicle 98.7
dog 97.6
adult 94.5
many 94.1
home 93.8
woman 93.6
several 92.9
cavalry 92.7
mammal 92.5
two 91.8
furniture 91.2
man 90.9
carriage 89.5
administration 89
recreation 88.1
four 88

Imagga
created on 2022-03-04

old 29.9
street 29.4
building 27.3
architecture 25
snow 20.7
brick 17.9
house 17.7
wheeled vehicle 17.7
urban 17.5
container 17.4
stone 16.7
mailbox 16.7
city 16.6
town 15.8
travel 14.1
vehicle 14
wall 13.2
box 13
chair 12.8
winter 11.9
road 11.7
night 11.5
light 11.4
construction 11.1
historic 11
window 11
structure 10.9
vintage 10.7
windows 10.6
barbershop 9.6
home 9.6
glass 9.3
tree 9.2
exterior 9.2
door 9.1
trees 8.9
tunnel 8.9
scene 8.7
antique 8.7
seat 8.5
cemetery 8.5
church 8.3
sky 8.3
tourism 8.2
tricycle 8.2
outdoors 8.2
car 8.2
style 8.2
transportation 8.1
history 8
sidewalk 8
cobblestone 7.9
holiday 7.9
ancient 7.8
sepia 7.8
arch 7.7
cold 7.7
residential 7.7
lamp 7.6
shop 7.6
path 7.6
dark 7.5
row 7.4
truck 7.2
gravestone 7.1
gas pump 7.1
conveyance 7.1

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

outdoor 97.8
black and white 92.6
text 88.5
old 59

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 18-24
Gender Male, 97.9%
Calm 94.2%
Disgusted 1.5%
Surprised 1.2%
Happy 0.9%
Confused 0.8%
Angry 0.6%
Sad 0.5%
Fear 0.2%

AWS Rekognition

Age 41-49
Gender Male, 98.6%
Calm 98.6%
Happy 0.5%
Confused 0.3%
Sad 0.3%
Surprised 0.1%
Angry 0.1%
Disgusted 0.1%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 98.3%

Categories

Captions

Text analysis

Amazon

psr

Google

YT33A2- XAGON
YT33A2-
XAGON