Human Generated Data

Title

Untitled (boy and girl between older couple, all standing in front of wall and under trees)

Date

c. 1930

People

Artist: Curtis Studio, American, active 1891–1935

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13072

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 99.5
Human 99.5
Person 99.3
Person 98.9
Text 92.2
Person 79.5
People 78.8
Newspaper 64.4
Apparel 59.1
Clothing 59.1
Lace 58.3
Poster 57.1
Advertisement 57.1
Door 56.6

Imagga
created on 2022-01-29

vintage 24
blackboard 23.2
film 22.2
black 21
old 19.5
frame 18
grunge 17.9
antique 17.3
retro 17.2
negative 16.4
screen 15.7
television 15.5
art 13.7
aged 12.7
man 12.1
texture 11.8
border 11.7
world 11.6
symbol 11.4
design 11.3
people 11.1
groom 10.9
dirty 10.8
ancient 10.4
portrait 10.3
person 10.2
letter 10.1
graphic 9.5
culture 9.4
structure 9.2
book jacket 9.2
window 9.1
photograph 9.1
paint 9
one 9
decoration 8.9
postmark 8.9
printed 8.8
envelope 8.8
movie 8.7
stamp 8.7
light 8.7
bride 8.6
paper 8.6
mail 8.6
window screen 8.6
monitor 8.6
damaged 8.6
male 8.5
covering 8.4
photographic paper 8.3
historic 8.2
material 8
office 8
painter 8
masterpiece 7.9
fame 7.9
known 7.9
shows 7.9
postage 7.9
postal 7.8
slide 7.8
sepia 7.8
telecommunication system 7.8
post 7.6
grungy 7.6
camera 7.4
global 7.3
protective covering 7.2
jacket 7.1
adult 7.1
icon 7.1
textured 7

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

television 99.7
monitor 98.2
text 97.9
drawing 97.4
wall 96.3
old 95.6
posing 92.7
person 92.1
clothing 90.9
sketch 90.8
screen 79.9
painting 71.6
image 59.6
human face 58.4
picture frame 53.2
cartoon 52.2
set 41.4
flat 29.2
vintage 27

Face analysis

AWS Rekognition

Age 26-36
Gender Female, 62.8%
Calm 77.7%
Happy 9.9%
Sad 8.3%
Surprised 1%
Angry 1%
Disgusted 0.8%
Confused 0.7%
Fear 0.6%

AWS Rekognition

Age 33-41
Gender Female, 99.8%
Happy 67.9%
Calm 31.2%
Surprised 0.3%
Sad 0.2%
Confused 0.1%
Angry 0.1%
Disgusted 0.1%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Poster 57.1%

Captions

Microsoft

a painting of a flat screen television 84.9%
a person posing for a photo in front of a flat screen television 76.6%
a painting of a flat screen tv 76.5%

Text analysis

Amazon

١١٢٠