Human Generated Data

Title

Untitled (man and woman seated on motor bike)

Date

c. 1950

People

Artist: Harry Annas, American 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2395

Machine Generated Data

Tags

Amazon
created on 2022-01-30

Person 99.7
Human 99.7
Person 99.7
Machine 96.4
Wheel 96.4
Automobile 96.1
Vehicle 96.1
Car 96.1
Transportation 96.1
Motorcycle 91.3
Truck 88
Hair 81.2
Tricycle 57.9
Clothing 56.9
Shorts 56.9
Apparel 56.9
Wheel 54.3
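
The Amazon tags above are the kind of output returned by AWS Rekognition's label-detection operation. The sketch below shows one way such labels might be retrieved with boto3; it is illustrative only, and the file name and confidence threshold are assumptions, not details taken from this record's pipeline.

    import boto3

    # Minimal sketch: detect labels in a local image with AWS Rekognition.
    # "photo.jpg" and the 50% threshold are illustrative placeholders.
    client = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:
        response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=50)

    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))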

Imagga
created on 2022-01-30

person 27.2
man 22.2
planner 20.9
people 18.9
helmet 17.7
football helmet 16.1
astronaut 15.7
sport 15
adult 14.9
lifestyle 14.4
newspaper 13.7
male 13.5
body 12.8
black 12.6
equipment 12.5
active 12.4
sky 11.5
fun 11.2
product 10.8
fashion 10.5
style 10.4
headdress 10.2
clothing 10.2
model 10.1
dark 9.2
modern 9.1
exercise 9.1
outdoors 8.9
device 8.8
creation 8.7
motion 8.6
portrait 8.4
automaton 8.3
transport 8.2
activity 8.1
play 7.8
travel 7.7
men 7.7
attractive 7.7
jump 7.7
vehicle 7.7
outdoor 7.6
costume 7.6
power 7.6
one 7.5
art 7.4
action 7.4
water 7.3
celebration 7.2
transportation 7.2
summer 7.1
conceptual 7

Microsoft
created on 2022-01-30

text 99.4
land vehicle 98.9
wheel 98.4
vehicle 97.7
outdoor 97.1
tire 92.4
car 84.5
motorcycle 77.8
auto part 73.9
black and white 67.3
old 56.7
person 55.4
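
The Microsoft tags here (and the captions listed further down) resemble output from Azure Computer Vision's analyze operation. The sketch below assumes the v3.2 REST endpoint with the Tags and Description features; the resource endpoint, subscription key, and image URL are placeholders.

    import requests

    # Sketch of an Azure Computer Vision v3.2 analyze call (assumed setup).
    # Endpoint, key, and image URL are placeholders.
    endpoint = "https://<resource>.cognitiveservices.azure.com"
    resp = requests.post(
        f"{endpoint}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags,Description"},
        headers={"Ocp-Apim-Subscription-Key": "<key>"},
        json={"url": "https://example.org/photo.jpg"},
    )
    resp.raise_for_status()
    result = resp.json()

    for tag in result.get("tags", []):
        print(tag["name"], round(tag["confidence"] * 100, 1))
    for caption in result.get("description", {}).get("captions", []):
        print(caption["text"], round(caption["confidence"] * 100, 1))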

Face analysis

AWS Rekognition

Age 43-51
Gender Female, 90.4%
Happy 44.1%
Calm 32.3%
Sad 17.3%
Angry 2.2%
Surprised 2.1%
Disgusted 1%
Confused 0.5%
Fear 0.5%

AWS Rekognition

Age 36-44
Gender Female, 96.2%
Calm 95%
Happy 4.5%
Angry 0.2%
Disgusted 0.1%
Surprised 0.1%
Sad 0.1%
Confused 0.1%
Fear 0%
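
The age ranges, gender estimates, and emotion scores above are typical of AWS Rekognition face detection when all facial attributes are requested. A minimal sketch follows; the file name is an illustrative placeholder.

    import boto3

    # Sketch: request full facial attributes from AWS Rekognition.
    client = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:  # placeholder file name
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print("Gender", face["Gender"]["Value"], round(face["Gender"]["Confidence"], 1))
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(emotion["Type"].title(), round(emotion["Confidence"], 1))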

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely
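
The Google results report per-face likelihood levels rather than percentages, matching the likelihood fields on Cloud Vision face annotations. A hedged sketch using the google-cloud-vision client is below; the file name is a placeholder.

    from google.cloud import vision

    # Sketch: per-face likelihoods from the Cloud Vision API.
    client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:  # placeholder file name
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)

    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)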

Feature analysis

Amazon

Person 99.7%
Wheel 96.4%
Car 96.1%
Motorcycle 91.3%
Truck 88%

Captions

Microsoft

a vintage photo of a group of people posing for the camera 84.8%
a vintage photo of a group of people posing for a picture 84.7%
a vintage photo of a man 84.6%

Text analysis

Amazon

YT37A2-XAOOX

Google

YT3FA2-XAGO
YT3FA2-XAGO
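
The strings above come from the services' text-detection (OCR) features. A minimal sketch of how the Amazon result might be produced with Rekognition's detect_text is shown below; the Google strings would come from Cloud Vision's text_detection in much the same way. The file name is a placeholder.

    import boto3

    # Sketch: line-level OCR with AWS Rekognition text detection.
    client = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:  # placeholder file name
        response = client.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"], round(detection["Confidence"], 1))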