Human Generated Data

Title

Untitled (group portrait of a family posing before a painted backdrop in an outdoor setting)

Date

c. 1856 - c. 1910

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Dr. Andrew S. Dibner, P2003.131.12931

Machine Generated Data

Tags

Amazon
created on 2022-01-24

People 99.9
Family 99.9
Human 99.9
Person 99.6
Person 99
Person 98.7
Person 97.9
Person 97.8
Person 97.3
Person 95.5
Person 82.8
Painting 77.9
Art 77.9
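As an illustration (not part of the record itself), label/confidence pairs like the Amazon list above are often filtered by a confidence threshold before display. A minimal sketch, using the values transcribed from this record:

```python
# Hypothetical sketch: filter machine-generated labels by confidence.
# The (label, score) pairs are transcribed from the Amazon tag list above.
amazon_tags = [
    ("People", 99.9), ("Family", 99.9), ("Human", 99.9),
    ("Person", 99.6), ("Person", 99.0), ("Person", 98.7),
    ("Person", 97.9), ("Person", 97.8), ("Person", 97.3),
    ("Person", 95.5), ("Person", 82.8), ("Painting", 77.9),
    ("Art", 77.9),
]

def confident_labels(tags, threshold=90.0):
    """Return distinct labels whose confidence meets the threshold."""
    seen, result = set(), []
    for label, score in tags:
        if score >= threshold and label not in seen:
            seen.add(label)
            result.append(label)
    return result

print(confident_labels(amazon_tags))  # distinct labels at >= 90% confidence
```

With the default 90% threshold, the duplicate lower-confidence "Person" detections and the weaker "Painting"/"Art" guesses drop out.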

Imagga
created on 2022-01-24

kin 89.6
military uniform 64.6
uniform 61
clothing 42.3
man 30.9
covering 26.1
consumer goods 25.8
male 24.1
people 21.2
soldier 19.5
military 18.3
adult 16.9
person 15.9
statue 15.2
protection 14.5
army 13.6
weapon 13.6
war 13.5
old 13.2
commodity 12.8
danger 11.8
camouflage 11.8
portrait 11.6
gun 11.4
religion 10.7
outdoor 10.7
face 9.9
sport 9.9
sculpture 9.6
men 9.4
monument 9.3
human 9
suit 9
history 8.9
sky 8.9
family 8.9
battle 8.8
conflict 8.8
couple 8.7
antique 8.7
ancient 8.6
mask 8.6
helmet 8.6
culture 8.5
two 8.5
vintage 8.3
fun 8.2
girls 8.2
love 7.9
black 7.8
warrior 7.8
architecture 7.8
stone 7.6
fashion 7.5
happy 7.5
dark 7.5
leisure 7.5
outdoors 7.5
holding 7.4
training 7.4
historic 7.3
art 7.2
women 7.1
to 7.1
happiness 7
travel 7
together 7

Google
created on 2022-01-24

People 78.8
Hat 78.1
Vintage clothing 76.2
Suit 74
Toddler 73.9
Tree 71.2
Event 70.7
Classic 69.4
Art 66.3
History 65.8
Sitting 64.6
Picture frame 63.5
Stock photography 63
Baby 61.3
Plant 59.2
Retro style 58.3
Family reunion 56.7
Family 56.3

Microsoft
created on 2022-01-24

person 99.7
clothing 99.4
grass 98.7
old 96.2
human face 94.4
smile 92.1
text 91
man 89
black 85.8
posing 85.6
woman 82
white 67.3
group 65.2
photograph 61.7
vintage 40.5
team 35.6
crowd 0.9

Face analysis

AWS Rekognition

Age 6-14
Gender Female, 66.6%
Calm 99%
Sad 0.4%
Confused 0.2%
Fear 0.1%
Angry 0.1%
Surprised 0.1%
Disgusted 0.1%
Happy 0%

AWS Rekognition

Age 4-10
Gender Female, 97.1%
Calm 69.6%
Sad 17.1%
Fear 4%
Angry 2.5%
Disgusted 2.4%
Surprised 1.8%
Confused 1.7%
Happy 0.9%

AWS Rekognition

Age 18-24
Gender Male, 93.3%
Calm 99.5%
Sad 0.1%
Happy 0.1%
Angry 0.1%
Surprised 0.1%
Fear 0%
Confused 0%
Disgusted 0%

AWS Rekognition

Age 39-47
Gender Male, 97%
Calm 91.3%
Happy 2.4%
Sad 1.3%
Surprised 1.3%
Angry 1.2%
Fear 1.1%
Confused 0.9%
Disgusted 0.5%

AWS Rekognition

Age 30-40
Gender Female, 98.5%
Surprised 67.9%
Calm 16.5%
Confused 12%
Angry 1%
Fear 0.9%
Sad 0.7%
Disgusted 0.6%
Happy 0.4%

AWS Rekognition

Age 0-6
Gender Female, 59.5%
Sad 55.9%
Fear 29.1%
Calm 6.4%
Angry 5.2%
Confused 1.8%
Happy 0.8%
Disgusted 0.5%
Surprised 0.4%

AWS Rekognition

Age 1-7
Gender Female, 100%
Sad 66.1%
Calm 26.8%
Angry 2.4%
Confused 1.8%
Fear 1%
Happy 0.8%
Surprised 0.6%
Disgusted 0.5%

AWS Rekognition

Age 43-51
Gender Male, 99.9%
Calm 99.6%
Sad 0.1%
Confused 0.1%
Surprised 0.1%
Angry 0.1%
Happy 0%
Disgusted 0%
Fear 0%
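Each Rekognition face record above is a full emotion distribution summing to roughly 100%. A hedged sketch of how a single dominant emotion could be extracted from one such record (values transcribed from the Age 39-47 face above):

```python
# Hypothetical sketch: pick the dominant emotion from a Rekognition-style
# emotion distribution (values transcribed from one face record above).
face_emotions = {
    "Calm": 91.3, "Happy": 2.4, "Sad": 1.3, "Surprised": 1.3,
    "Angry": 1.2, "Fear": 1.1, "Confused": 0.9, "Disgusted": 0.5,
}

def dominant_emotion(emotions):
    """Return the (emotion, confidence) pair with the highest score."""
    return max(emotions.items(), key=lambda item: item[1])

print(dominant_emotion(face_emotions))  # ('Calm', 91.3)
```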

Microsoft Cognitive Services

Age 51
Gender Male

Microsoft Cognitive Services

Age 29
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
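Unlike the numeric scores above, Google Vision reports face attributes on an ordinal likelihood scale. A minimal sketch (an assumption for illustration, not part of the record) mapping those strings to ranks so the per-face results can be compared numerically:

```python
# Hypothetical sketch: map Google Vision likelihood strings to ordinals
# so the per-face attributes listed above can be compared numerically.
LIKELIHOOD_ORDER = [
    "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely",
]

def likelihood_rank(value):
    """Return the ordinal position of a likelihood string (0 = least likely)."""
    return LIKELIHOOD_ORDER.index(value)

print(likelihood_rank("Possible"))  # 2
```

Under this mapping, the one face rated "Headwear Likely" ranks above the two rated "Possible" and the rest rated "Very unlikely".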

Feature analysis

Amazon

Person 99.6%
Painting 77.9%

Captions

Microsoft

a vintage photo of Roman von Ungern-Sternberg et al. posing for the camera 98.3%
a vintage photo of Roman von Ungern-Sternberg et al. sitting posing for the camera 98%
a vintage photo of Roman von Ungern-Sternberg et al. posing for a picture 97.9%