Human Generated Data

Title

Untitled (beach scene)

Date

1976

People

Artist: Sage Sohier, American, born 1954

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.1130

Copyright

© Sage Sohier

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.5
Human 99.5
Person 99.2
Person 97.2
Person 96.9
Art 95.2
Bird 93.9
Animal 93.9
Drawing 89.4
Person 87.8
Bird 84.6
Person 76.5
Bird 75.9
Sketch 73.7
Painting 68.9
Bird 67.8
Person 62.1
Bird 62
Nature 60.6
Outdoors 60.6
Person 60.5
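
The Amazon tags above are name-and-confidence pairs of the kind an object and scene label detector returns. Below is a minimal sketch, assuming AWS Rekognition's DetectLabels call via boto3, of how comparable pairs could be retrieved; the bucket and file names are placeholders, not the museum's actual storage.

```python
# Hypothetical sketch: fetching object/scene labels for an image with AWS Rekognition.
# The S3 bucket and object key are placeholders, not Harvard Art Museums' storage.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "beach-scene.jpg"}},
    MaxLabels=20,
    MinConfidence=60,  # the tag list above bottoms out around 60%
)

# Print "Name Confidence" pairs, mirroring the tag list format above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```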

Clarifai
created on 2023-10-25

people 99.8
wear 98.6
group 98.5
print 98.4
art 97.5
man 97.3
group together 95.7
adult 95.3
family 93.2
child 92.2
lid 90.4
portrait 89.7
three 89.7
illustration 89.6
woman 88.9
painting 86.7
retro 84.6
boy 83.8
monochrome 83.6
several 83.1

Imagga
created on 2022-01-09

silhouette 33.9
grunge 33.2
vintage 22.3
art 21.2
decoration 21
drawing 20.8
sketch 20.4
old 20.2
graffito 19.9
texture 18.7
design 18.6
black 18
frame 17.9
retro 17.2
gymnasium 16
paint 15.4
dirty 15.4
graphic 13.1
antique 13
aged 12.7
pattern 12.3
representation 12.1
athletic facility 12
fun 12
people 11.7
space 11.6
male 11.3
envelope 11.2
man 10.7
billboard 10.7
web site 10.7
poster 10.4
sport 10.3
facility 10.3
screen 10.1
rough 10
border 9.9
text 9.6
grungy 9.5
structure 9.2
power 9.2
symbol 8.7
messy 8.7
paper 8.7
ink 8.7
men 8.6
floral 8.5
flower 8.5
wallpaper 8.4
summer 8.4
freedom 8.2
team 8.1
material 8
signboard 7.9
noise 7.8
color 7.8
silhouettes 7.8
play 7.8
rust 7.7
edge 7.7
stain 7.7
dirt 7.6
outdoors 7.5
style 7.4
container 7.3
person 7.3
boy 7.2
negative 7.1
wall 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

drawing 93.6
person 91.8
monitor 90.3
text 88.9
sketch 86.2
cartoon 84
clothing 83.8
man 81.3
posing 72.6
old 68.1
gallery 66.3
black and white 62.4
different 47.2
vintage 35
same 26.1
picture frame 8.4

Color Analysis

Face analysis

AWS Rekognition

Age 34-42
Gender Male, 54.8%
Calm 70.2%
Disgusted 26.6%
Confused 1.3%
Surprised 0.6%
Fear 0.6%
Angry 0.4%
Sad 0.2%
Happy 0.1%

AWS Rekognition

Age 30-40
Gender Male, 99.6%
Calm 96.3%
Sad 2.7%
Angry 0.3%
Happy 0.2%
Confused 0.2%
Surprised 0.1%
Disgusted 0.1%
Fear 0.1%
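
Each AWS Rekognition block above reports, per detected face, an estimated age range, a gender estimate with confidence, and a set of emotion confidences. A hedged sketch of how such values could be requested with boto3's DetectFaces call follows; the image file name is a placeholder.

```python
# Hypothetical sketch: per-face age range, gender, and emotion scores via AWS Rekognition.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("beach-scene.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # required to get AgeRange, Gender, and Emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Sort emotions by confidence to match the descending order shown above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```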

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
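
The Google Vision rows express each face attribute as a likelihood bucket (Very unlikely, Unlikely, Possible, Likely, Very likely) rather than a percentage. A minimal sketch, assuming the google-cloud-vision client library, of how those per-face likelihoods could be read; the file name is a placeholder.

```python
# Hypothetical sketch: per-face likelihood buckets via the Google Cloud Vision API.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("beach-scene.jpg", "rb") as f:  # placeholder file name
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

for face in response.face_annotations:
    # Likelihood enums render as VERY_UNLIKELY, UNLIKELY, POSSIBLE, LIKELY, VERY_LIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```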

Feature analysis

Amazon

Person 99.5%
Bird 93.9%

Categories

Imagga

paintings art 84.1%
interior objects 12.6%
food drinks 1.6%