Human Generated Data

Title

Untitled (couple posed sitting in front of Christmas tree looking at daughter)

Date

1961

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9814

Machine Generated Data

Tags

Amazon
created on 2022-01-24

Person 99.2
Human 99.2
Person 98.8
Person 98.7
Clothing 95.1
Apparel 95.1
Accessories 93.3
Accessory 93.3
Tie 93.3
People 79.1
Footwear 73.4
Shoe 73.4
Face 71.7
Furniture 69.3
Chair 69.3
Portrait 68.1
Photo 68.1
Photography 68.1
Shorts 66.5
Clinic 63.4
Girl 59.1
Female 59.1
Shirt 57
Indoors 56.2
Coat 55.8
Flooring 55
Overcoat 55
Suit 55
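
These labels and scores follow the output of Amazon Rekognition's label detection. A minimal sketch of the call that yields such a list, assuming configured AWS credentials; the file name "photo.jpg" and the MinConfidence threshold are placeholders, not part of the original record:

    import boto3

    client = boto3.client("rekognition")

    # Read the image locally; Rekognition also accepts S3 references.
    with open("photo.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,  # lowest score in the list above is 55
        )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")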

Imagga
created on 2022-01-24

person 34.6
man 30.9
people 27.9
nurse 25.6
male 23.5
sport 23.1
businessman 19.4
adult 18.7
active 18.3
business 18.2
snow 15.8
winter 15.3
black 15
men 14.6
outdoor 13.8
success 12.9
exercise 12.7
professional 12.2
negative 11.9
patient 11.9
silhouette 11.6
film 11.4
portrait 11
athlete 10.9
team 10.8
fashion 10.6
office 10.4
standing 10.4
walking 10.4
corporate 10.3
manager 10.2
work 10.2
action 10.2
lifestyle 10.1
world 10.1
happy 10
suit 9.9
sick person 9.8
job 9.7
outdoors 9.7
group 9.7
case 9.6
boy 9.6
planner 9.4
life 9.2
pose 9.1
fitness 9
fun 9
worker 8.9
cold 8.6
motion 8.6
casual 8.5
summer 8.4
leisure 8.3
child 8.2
businesswoman 8.2
ice 8.1
activity 8.1
sand 8.1
runner 7.8
finance 7.6
beach 7.6
photographic paper 7.4
freedom 7.3
looking 7.2
happiness 7.1
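
A comparable list can be requested from Imagga's tagging endpoint. A minimal sketch, assuming an Imagga account; the credentials and image URL are placeholders:

    import requests

    IMAGGA_KEY = "your_api_key"        # placeholder
    IMAGGA_SECRET = "your_api_secret"  # placeholder

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/photo.jpg"},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),  # HTTP Basic auth
    )

    # Imagga nests the language-keyed tag name under "tag".
    for tag in response.json()["result"]["tags"]:
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")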

Google
created on 2022-01-24

(no tags returned)

Microsoft
created on 2022-01-24

text 99.4
clothing 89.5
person 86.3
black and white 76.9
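
Tags like these can be produced with the Azure Computer Vision tag operation, which reports confidence on a 0-1 scale (rendered above as percentages). A minimal REST sketch; the endpoint, key, and image URL are placeholders:

    import requests

    ENDPOINT = "https://<resource>.cognitiveservices.azure.com"  # placeholder
    KEY = "your_subscription_key"                                # placeholder

    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/tag",
        headers={"Ocp-Apim-Subscription-Key": KEY,
                 "Content-Type": "application/json"},
        json={"url": "https://example.org/photo.jpg"},  # placeholder URL
    )

    for tag in response.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")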

Face analysis

AWS Rekognition

Age 31-41
Gender Female, 67.6%
Calm 76.1%
Surprised 11.8%
Happy 6.8%
Angry 2.1%
Disgusted 1.2%
Sad 1.1%
Confused 0.6%
Fear 0.2%
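
A minimal sketch of the Rekognition face-analysis call behind a block like this one. Attributes=["ALL"] is required to get age range, gender, and emotion scores; the file name is a placeholder:

    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # the default response omits emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions arrive unsorted; sort to match the listing above.
        for emotion in sorted(face["Emotions"],
                              key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} "
                  f"{emotion['Confidence']:.1f}%")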

Google Vision (first face)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision (second face)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely
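
Google Vision reports per-face likelihood buckets rather than percentages, which is why two detected faces yield two blocks like the ones above. A minimal sketch using the google-cloud-vision client; the file name is a placeholder:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Index positions mirror the Likelihood enum (UNKNOWN .. VERY_LIKELY).
    BUCKETS = ("Unknown", "Very unlikely", "Unlikely",
               "Possible", "Likely", "Very likely")

    for face in response.face_annotations:
        print("Surprise", BUCKETS[face.surprise_likelihood])
        print("Anger", BUCKETS[face.anger_likelihood])
        print("Sorrow", BUCKETS[face.sorrow_likelihood])
        print("Joy", BUCKETS[face.joy_likelihood])
        print("Headwear", BUCKETS[face.headwear_likelihood])
        print("Blurred", BUCKETS[face.blurred_likelihood])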

Feature analysis

Amazon

Person 99.2%
Tie 93.3%
Shoe 73.4%
Chair 69.3%
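
These entries repeat the scores of Rekognition labels from the tag list above, which suggests they are the labels returned with Instances, i.e. per-object bounding boxes. A sketch of reading those from a detect_labels response; the file name is a placeholder:

    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        labels = client.detect_labels(Image={"Bytes": f.read()})["Labels"]

    # Only some labels (e.g. Person, Tie, Shoe, Chair) carry Instances.
    for label in labels:
        for instance in label.get("Instances", []):
            box = instance["BoundingBox"]  # ratios of image width/height
            print(f"{label['Name']} {instance['Confidence']:.1f}% "
                  f"left={box['Left']:.2f} top={box['Top']:.2f} "
                  f"w={box['Width']:.2f} h={box['Height']:.2f}")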

Captions

Microsoft

a man jumping in the air 40.2%
an old photo of a man 40.1%
a close up of a man jumping in the air 32%
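
Candidate captions like these can be requested from the Azure Computer Vision describe operation; confidence is again on a 0-1 scale. A minimal sketch with placeholder endpoint, key, and image URL:

    import requests

    ENDPOINT = "https://<resource>.cognitiveservices.azure.com"  # placeholder
    KEY = "your_subscription_key"                                # placeholder

    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/describe",
        headers={"Ocp-Apim-Subscription-Key": KEY,
                 "Content-Type": "application/json"},
        params={"maxCandidates": 3},  # the record shows three candidates
        json={"url": "https://example.org/photo.jpg"},
    )

    for caption in response.json()["description"]["captions"]:
        print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")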

Text analysis

Amazon

22335

Google

5
.........
3
2
MJI7--YT33A°2--A
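
Detected strings such as "22335" come from OCR over the print. A minimal sketch of the Amazon side using Rekognition text detection; the file name is a placeholder:

    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})

    # LINE detections return whole strings; WORD detections the tokens.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])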